AI & SOCIETY (2020) 35:113–134
https://doi.org/10.1007/s00146-018-0846-4
ORIGINAL ARTICLE
Legal framework forsmall autonomous agricultural robots
SubhajitBasu1· AdekemiOmotubora2· MattBeeson3· CharlesFox4,5
Received: 16 November 2017 / Accepted: 24 April 2018 / Published online: 8 May 2018
© The Author(s) 2018
Abstract
Legal structures may form barriers to, or enablers of, adoption of precision agriculture management with small autonomous
agricultural robots. This article develops a conceptual regulatory framework for small autonomous agricultural robots, from
a practical, self-contained engineering guide perspective, sufficient to get working research and commercial agricultural
roboticists quickly and easily up and running within the law. The article examines the liability framework, or rather lack of
it, for agricultural robotics in the EU, and its transposition into UK law, as a case study illustrating general international legal
concepts and issues. It examines how the law may provide mitigating effects on the liability regime, and how contracts can
be developed between agents within it to enable smooth operation. It covers other legal aspects of operation such as the use
of shared communications resources and privacy in the reuse of robot-collected data. Where there are some grey areas in
current law, it argues that new proposals could be developed to reform these to promote further innovation and investment
in agricultural robots.
Keywords Agriculture · Robotics · Legal · Safety · Agribot
1 Introduction
Self-driving vehicles are rapidly arriving both on (Guizzo
2011) and off (Blackmore et al. 2004) roads. In the agri-
cultural setting, technology has progressed from tractor
driver-assistive systems such as RTK-GPS displays, to fully
autonomous, self-driving platforms capable of carrying out
agricultural tasks with no human intervention (Pedersen et al.
2006). While legal aspects of on-road autonomous vehicles
have been well studied (Beiker 2012; Anderson et al. 2014;
Pinto et al. 2012; Douma and Palodichuk 2012; Brodsky
2016), there is a need for an analogous understanding of off-
road agricultural vehicles' legal positions, despite the forecast
for the agri-robotics sector to reach 5.7 bn USD by the year
2024 (Transparency Market Research 2017). The present
study reviews the relevant legal frameworks from the
perspective of a practical engineering implementer of
agricultural robotics technologies, to fill this need. It is
intended as a self-contained guide for practising engineers
to find all the information needed to get their autonomous
agricultural robotic research systems up and running, quickly
and easily within the law. As such it does not constitute
formal legal advice, which should be taken in addition to
the overview given here.
Autonomous agricultural vehicles have been developed
in two broad classes: automated large tractors and smaller
(e.g., < 1tonne) precision robots. Automated tractor systems
have been developed (Ishida etal. 1998; Michio etal. 2002;
Blackmore etal. 2004; Dvorak 2016) based on existing man-
ual-drive tractors, which already have commercially avail-
able high precision GPS guidance. These systems compute
paths to swathe fields, typically in rows with headland turns.
In some cases, this guidance consists of telling the human
operator precisely what angle to turn the steering wheel at
each second (e.g., Trimble EZ-Guide Lightbar, http://www.
trimble.com); others show deviation from the computed
path and leave the human driver to correct for it. Automated
tractors typically aim to perform the same type of work as
manual-drive tractors, namely bulk operations across whole
fields, such as seeding, spraying and harvesting of row crops.
* Charles Fox
charles.fox@gmail.com
1 School ofLaw, University ofLeeds, Leeds, UK
2 Faculty ofLaw, University ofLagos, Lagos, Nigeria
3 Risktec Solutions Ltd, Part oftheTUV Rheinland Group,
Berlin, Germany
4 School ofComputer Science, University ofLincoln, Lincoln,
UK
5 Ibex Automation Ltd, Sheffield, UK
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
In contrast, small autonomous robots for agriculture
("agribots", Fig. 1) have focused on precision applica-
tions. Large vehicles are required for bulk operations due
to the need to physically transport bulk materials such
as seeds, fertiliser and produce. Small robots make up for
reduced bulk transport capability by aiming, ultimately,
to work on a per-plant basis. This enables them to trans-
port smaller amounts of more targeted materials: they can
apply reduced herbicide doses to individual weeds (Binch
and Fox 2017), detect the fertiliser needs of, and apply
fertilisers to, individual plants (Singh and Shaligram 2014),
and harvest plants (Bac et al. 2014) when they are
individually optimally ready for consumption.
In some cases, precision systems are also found on
large tractors, with variable rate controls used to make
bulk operations more precise (Escola et al. 2013).
The legal implications of these technologies are differ-
ent from those of on-road self-driving vehicles. On-road
vehicle operations take place in public places—high-
ways—which are governed by highways legislation. In
contrast, most agricultural robots are intended to operate
on privately owned agricultural land, governed by differ-
ent business, agricultural and environmental laws. How-
ever, such land is not free from interactions with humans,
whose safety and legal positions must be considered. Farm
owners, managers and workers may be present as well as
walkers on public footpaths and trespassers. In the event
of accidents, the roles of owners, managers, manufactur-
ers and designers of systems must be considered. Existing
legal and environmental restrictions and responsibilities must
be taken into account: damage caused to the environment
is a greater concern than in the on-road case, including the
application of chemicals and damage to crops and soils.
1.1 Overview
In these respects, this article addresses three questions:
What is the legal regime on the liability of manufacturers,
suppliers and users of autonomous robots in the UK/EU?
Does the law provide any mitigation of liability which
could promote innovation in autonomous robots? Apart
from the law in the UK/EU, what are the current debates
and legal outlook on robotics and how can these shape the
law in the area of small autonomous agricultural vehicles?
After brief introductions to engineering for lawyers and
law for engineers, Sect. 2 of the article examines the
liability framework which applies to autonomous robots
given the lack of a separate or specific framework for robot-
ics in the UK and the EU. Section 3 considers how the
law may provide mitigating effects on the liability regime.
These parts are intended for use by practising roboticists in
need of a self-contained guide to their legal environment.
Section4 presents the debates on grey areas in the law
and proposals which may be adopted to reform the law and
promote future innovation and investment in small autono-
mous agricultural vehicles. This part is intended both for
use by policymakers and as a demarcation of potentially
dangerous uncertain legal areas for practising roboticists.
The article concludes that the law could facilitate inno-
vation in agribots for the following reasons: The legal
framework for autonomous robots cuts across different laws.
Therefore, liability could be shared or distributed among
different parties to a contract for the use or operation of an
agribot. All legislation which imposes liabilities also pro-
vides corresponding defences which may aid the avoidance
of liability or the mitigation of damages. There are in par-
ticular specific defences which address the peculiarities of
technologies. Contracts can be used to define the rights and
obligations of respective parties, and unless the law other-
wise specifies, the contract can exclude liabilities for specific
claims. Courts are legally obliged to consider the utility and
social and economic value of an activity in awarding dam-
ages for loss and injury. Current debates suggest an apprecia-
tion of the new challenges posed by innovations in robotics
for law and policy. These can facilitate resolution and legal
intervention in the grey areas surrounding the legal frame-
work for small autonomous agricultural vehicles.
1.2 Basic self‑driving technology concepts (for
lawyers)
All vehicle automation systems are in practice probabil-
istic in their behaviour to some extent. Modern machine
navigation (Thrun 2005) and object recognition systems
use Bayesian probability frameworks (Bishop 2006).
Fig. 1 Example of an autonomous small robot for agriculture (agri-
bot). This agribot weighs 250 kg and precision-sprays weeds on hill
farms. (Photo: Ibex Automation Ltd.)
Probabilities appear in these models in two distinct ways.
First, the models assume precise probability distributions
of sensory features given states of the world. This in itself
assumes that the world is random and probabilistic, though
the probabilities in the equations can, in theory, be
manipulated precisely and deterministically.
Outdoor environments, weather, and the complexity of
plant biology and animal behaviours ensure that the world
is indeed random for all practical purposes—this contrasts
with other robotics applications such as food processing
production lines where the environment and produce can
be tightly controlled and standardised (Chua et al. 2003).
Second, however, Bayesian inference is known to be
computationally intractable in general (Cooper 1990). This
means that system designers can work only with approxi-
mation algorithms. Some of these approximations are
deterministic such as Variational Bayes (Fox and Roberts
2012), while popular Monte Carlo approximations use
random number generation to perform stochastic sam-
pling (Andrieu et al. 2003). Stochastic methods do not
have deterministic behaviour, though they converge to
exact answers in the theoretical case of infinite computa-
tion time.
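The distinction between a deterministic Bayesian answer and a stochastic Monte Carlo approximation of it can be illustrated with a toy sketch. The sensor model and all probabilities below are invented for illustration only and do not describe any cited system:

```python
import random

# Hypothetical example: estimating the probability that a detected
# plant is a weed, given that a noisy sensor fired, by Monte Carlo
# sampling from an assumed generative model.

def mc_posterior_weed(n_samples, seed=None):
    rng = random.Random(seed)
    # Assumed (invented) model: 30% of plants are weeds (prior);
    # the sensor fires with p=0.9 for weeds and p=0.2 for crops.
    weed_hits = 0
    fired = 0
    for _ in range(n_samples):
        is_weed = rng.random() < 0.3
        sensor_fires = rng.random() < (0.9 if is_weed else 0.2)
        if sensor_fires:
            fired += 1
            weed_hits += is_weed
    # Fraction of "sensor fired" samples that were truly weeds.
    return weed_hits / fired if fired else 0.0

# The exact, deterministic answer by Bayes' rule for comparison:
exact = (0.9 * 0.3) / (0.9 * 0.3 + 0.2 * 0.7)  # ≈ 0.659
```

With the seed fixed, the estimate is reproducible; without it, repeated runs return slightly different values, converging on the exact answer only as the sample count grows. This is the non-deterministic behaviour discussed above.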
Bayesian theory may further make use of prior infor-
mation in addition to real-time sensing (Bernardo and Smith
2001). This means that the perception of a state, and action
selection based on it, can be determined not just by current
inputs but also by assumptions and/or observations from
the past about similar states. In the on-road case, historical
data might show that other road users of particular demo-
graphics have statistically predictable tendencies to behave in
certain ways during interactions with the autonomous vehi-
cle. Statistically, making use of such information as well
as real-time sensors is optimal for decision-making. How-
ever, the ethics of doing so are controversial. Use of prior
information is expressly prohibited in most legal systems,
even though it is known to give more accurate judgements
(Levitt and Laskey 2000). For off-road agricultural vehicles,
such human interactions are of less concern, but similar
questions about the use of priors may arise. A vehicle may
behave in ways unexpected by its owner or operator if its
designers have programmed it with different prior expecta-
tions than those of the owner or operator. For example, a
weed spraying robot designer might assume that weeds are a
priori more probable to be found near walls than in the open
field, but a farmer’s particular field might have all the weeds
in the centre, leading to the farmer complaining about poor
quality spraying decisions. Rather than making such assump-
tions manually, the designer may have the system learn from
data. The designer can collect and analyse historical data
before use of the vehicle, to inform and fix the priors. As
with manual assumptions, the choice of this data is essen-
tial and will still reflect the designers’ assumptions about
what constitutes “typical” data. Again, if this differs from
the users’ assumptions, then problems may occur. In some
systems, the learning from data process may continue after
the sale and use of the vehicles, with algorithms updating
their priors to include observations from the user’s runs,
including data from the present day’s work right up to the
present decision time. In this case, the prior information
may now include a mixture of the designers directly pro-
grammed assumption, the designer’s historical data, and the
user’s data, which has previously been identified as a legal
problem (Beck 2016).
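For concreteness, the way a designer's prior assumption can be revised by data from the user's own farm may be sketched as a conjugate Beta-Bernoulli update. The weed-location scenario and all numbers below are hypothetical, chosen only to illustrate the designer-versus-user mismatch described above:

```python
# Hypothetical sketch: a designer's prior belief about where weeds
# occur, revised by observations collected on the user's farm.

def update_beta(alpha, beta, weeds_found, plants_checked):
    """Conjugate Beta-Bernoulli update: returns posterior parameters."""
    return alpha + weeds_found, beta + (plants_checked - weeds_found)

def mean(alpha, beta):
    """Posterior mean probability that a plant is a weed."""
    return alpha / (alpha + beta)

# Designer's (invented) prior: weeds are common near walls,
# Beta(8, 2), i.e. prior mean 0.8.
alpha, beta = 8, 2

# The farmer's field contradicts this: only 2 weeds in 50 wall plants.
alpha, beta = update_beta(alpha, beta, weeds_found=2, plants_checked=50)

# The posterior mean drops toward the observed rate:
print(round(mean(alpha, beta), 3))  # Beta(10, 50) -> 0.167
```

The more of the user's data the system absorbs, the less the designer's original assumption dominates its decisions, which is precisely why the mixture of designer and user contributions raises the attribution questions noted above.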
As well as being used for training priors, data collected from
users' farms during operation is valuable for analysis. For
example, yield maps (Blackmore et al. 2003) can contain
information financially valuable not only to the farmer but
also to neighbouring and distant farms when used to pre-
dict trends and correlations. As with other "big data" sys-
tems such as in-car GPS route planning, which collects data
on drivers’ locations, questions arise about who owns this
data—the owner, operator or designer?
1.3 Basic legal concepts (for engineers)
The purpose of the law is to enable all stakeholders to get
along with one another in a society. This includes regulat-
ing how they should share scarce public resources, and how
they should handle externalities caused to one another as
side effects of their private contracts. For example, com-
municating with outdoor robots requires sharing of the radio
spectrum with other local users, whilst applying fertiliser
which runs off into a river may have negative externalities
to both the general public, who use it in their water supply,
and to individual neighbouring farmers.
There is no specific regulatory regime for agricultural
robots or for robots generally, and indeed it may be difficult
to have a single regulatory regime as robots differ on a num-
ber of criteria including functions, level of autonomy and
human–machine interaction.1 Therefore, liability could cut
across different areas of the law, including tort, contract and
criminal law (as well as administrative actions), as shown
in Fig. 2.
Fig. 2 Basic divisions of law
1 For example, there are service robots, military robots, toy robots
and so on.
1.3.1 Torts
A tort is a civil wrong (committed by a person called the
tortfeasor) that results in loss or damage to another person,
and anyone who has suffered a loss as a result of another’s
civil wrong can bring an action for redress. For example, the
manufacturer or producer of a defective product can be held
liable for the tort of negligence if the product causes per-
sonal injury or other damage to the user. Therefore, because
the agribot is likely to be regarded as a product, its manufac-
turers/designers would be subject to laws regulating liabil-
ity for defective products. Also, agricultural contractors and
farmers as users or owners of the agribot, and their respec-
tive agents can be subject to different legal rules and statu-
tory provisions governing negligence, accidents and injury
to individuals as well as for loss of or damage to property.
1.3.2 Contracts
While a tort is a civil wrong entitling a party to sue the other
party for a breach of duty owed under the law, a contract is
an agreement which the law would enforce. The parties vol-
untarily agree the terms of a contract and where permitted
by law, the parties may exclude or limit their liabilities for
certain acts or omissions.2 Stated differently, a contractual
relationship is governed by the contract rather than by law
and parties may bring a (civil) action to enforce the terms
of the contract including claims for damages for breach of
the contract. In civil actions such as tort and contract, the
required standard of proof is the ‘balance of probabilities’,
and the claimant must discharge the burden of proving any
loss, injury or damage. Contracts may, in some cases, be
used to transfer a liability between parties.
1.3.3 Crimes
Unlike civil proceedings that are initiated by private citizens
against other citizens or organizations or the government, it
is the state that initiates criminal proceedings for a breach of
the criminal law. A crime is a wrong against society; any
act that constitutes a crime must be defined as such by the law,
with its punishment also stipulated by the law. The standard
of proof in criminal cases is 'proof beyond reasonable doubt',
and the onus of discharging the burden lies on the prosecution
or the state.3 Punishment for crimes can range from minor
fines to lengthy imprisonment. It is important to note that
although there is a development towards ‘corporate criminal
liability’ (the concept that corporations should be held liable
for criminal acts of officials such as directors, managers and
employees), there is no common European approach in this
area and domestic laws vary. Some countries (such as Ger-
many) do not impose criminal liability on corporations; others
rely on administrative sanctions (see notes below).
1.3.4 Administrative actions
Administrative actions are concerned with activities of
administrative agencies to which authority is delegated based
on the agency's expertise on the subject matter. Administra-
tive actions, therefore, involve oversight functions through
the enforcement of statutory laws (laws made by parliament)
and rules made by the administrative agencies themselves.
Unlike torts, where individuals can bring actions for dam-
age or injury, and criminal prosecutions, which are initiated
by the state, administrative actions involve ensuring compliance through
oversight which may entail levying fines and other sanctions
on organizations in breach of the law. To invoke sanctions,
damage or injury need not occur; all that is required is non-
compliance with standards set by the law. For example, data
protection authorities can impose fines for non-compliance
with the principles of data protection even when data has not
been lost or stolen as a result of such non-compliance. Also,
environment protection agencies can impose fines for failure
to report pollution or to take remedial actions.
It is important to note that although liability is discussed
under the separate heads below, in practice, civil and crimi-
nal liability may arise from the same activity and administra-
tive actions may overlap with civil and criminal sanctions
(See notes on data protection below).
1.3.5 Laws, regulations, directives and standards
The potential for torts and crimes is introduced by legal acts
of national and international parliaments. In the EU, Regula-
tions and Directives are different types of legal acts of the
EU. According to Article 288 of the Treaty on the Function-
ing of the European Union,
A regulation shall have general application. It shall
be binding in its entirety and directly applicable in all
Member States.
A directive shall be binding, as to the result to be
achieved, upon each Member State to which it is
addressed, but shall leave to the national authorities
the choice of form and methods.
2 Exclusion and limiting clauses allow parties to either limit or
exclude liabilities for acts or omissions for which they would ordinar-
ily be liable.
3 Health and safety law is the exception to this rule; the onus is on
the defendant(s) to demonstrate to the satisfaction of the courts that
they have discharged their duties under health and safety law.
Thus, an EU regulation is an immediately binding law
without further actions required, while directives are typi-
cally ‘transposed’ by member states into new local laws
which implement them. In the UK, directives with relevance
to Health and Safety are implemented in the form of regula-
tions under the powers granted by the Health and Safety at
Work Act 1974.
Technical standards are distinct from laws, and their use
is usually voluntary. Standards are defined by technical com-
mittees, including the International Organization for
Standardization (ISO), the European Committee for
Standardization (CEN, whose standards carry the EN prefix),
the American National Standards Institute (ANSI), national
bodies such as the British Standards Institution (BSI) in the
UK, local industry sector organizations, and sometimes
committees within a single
organization. Reasons for voluntary use include the ability to
provide the customer with a guarantee—via contract law—of
meeting a publicly known and accepted level of quality or
safety; and also, the desire to make use of industry-wide tech-
nical best practices consolidated in a standard. Like direc-
tives, standards are often transposed between regions and
subregions, for example a standard named with “ISO EN
BS” may have begun as an international ISO standard, then
transposed downward via both Europe and UK organizations;
or it may have begun as a UK standard and been transposed
upwards through the EU and ISO.
In some cases, the law may grant special status to a stand-
ard, giving it legal force, such as requiring all manufacturers
to implement it for certain types of product. This is known
as “calling up” the standard.
In the EU, compliance with Product Safety standards
which are published in the Official Journal of the European
Union is assumed to demonstrate compliance with relevant
directives supported by the standards. Some directives will
recommend that such standards be created
along with their legislation, to aid compliance with that leg-
islation. These are known as 'harmonised standards'.
Several EU directives require products to obtain a
"CE mark" (Conformité Européenne) before sale.
The CE mark then allows sale across the European Economic
Area (EEA), showing compliance with all relevant directives.
The relationships between laws, directives, and standards
are illustrated in Fig. 3.
2 Regulating agribots: legal framework
In the light of the above, the sections that follow examine
how tortious, contractual and criminal liabilities, and use of
standards could arise in the manufacture, use or operation
of agricultural robots.
2.1 Liability intort
2.1.1 Product defect
EU Directive 85/374/EEC on the approximation of the laws,
regulations and administrative provisions of the Member
States concerning liability for defective products regulates
liability for defective products4 [transposed in the UK as the
Consumer Protection Act (CPA) 1987]. The directive applies to all types of products
(CPA) 1987]. The directive applies to all types of products
including agricultural products. Under the law, “product”
is defined as all movables even if incorporated into another
movable or an immovable (see art 2 of the amending Direc-
tive). A producer means the manufacturer of a finished prod-
uct, the producer of any raw material or the manufacturer
of a component part. A producer also includes any person
who, by putting his trademark or other distinguishing feature
on the product, presents himself as its producer (art 3
Directive).
The Directive lays down the principle of liability without
fault or strict liability which means a person injured by a
defective product can claim damages even if the defect was
not due to the producer's or manufacturer's negligence. A defec-
tive product is one which does not provide the safety which
a person is entitled to expect, considering all circumstances,
including the presentation of the product (such as adequacy
of the warning5), the use to which it could reasonably be
expected that the product would be put, and the time when
the product was put into circulation (art 6). The
standard of the defect is, therefore, an objective one. For
example, a product is defective if its safety is not such as
persons generally (everyone and not the particular claimant
Fig. 3 Relationships between laws, directives, and standards. Crea-
tion is shown by solid arrows. Compliance is shown by dashed arrows
4 See Directive 85/374/EEC on the approximation of the laws, regu-
lations and administrative provisions of the Member States concern-
ing liability for defective products (as amended by Directive 1999/34/
ec) The Directive is implemented in the UK by the Consumer Protec-
tion Act (CPA) 1987).
5 See e.g. UK CPA s 3 (2)(a).
injured by the product) are entitled to expect. Also, the law
does not infer defect from the fact that a better or safer prod-
uct was subsequently put into circulation or permit persons
to expect standards of safety that are unknown or which do
not exist at the relevant time (art 6, 7 Directive, s 3 CPA).
Moreover, to succeed in an action for damages, the claim-
ant or injured person must prove the damage and the defect
in the product as well as the causal relationship between the
damage and the defect (art 4). In other words, the claimant
must prove that he suffered damage, that there was a defect
in the product and that the defect caused the damage. Pre-
sumably, therefore, if a claimant is unable to prove defect, he
cannot prove that loss or damage resulted from such defect.
However, in cases where the causal link is established, the
law also provides for defences which are of particular rel-
evance to the manufacturer of the agribot. For example, it is
a defence that the producer (or manufacturer) did not put the
product into circulation or that the defect did not exist at the
time the product was put into circulation (art 7). These argu-
ably cover instances where someone caused the fault after
the manufacturer supplied the agribot or where interference
with software causes the agribot to malfunction or where
the agribot has been used for a purpose for which it was not
intended (See notes on dual-use below).
Other grounds for avoiding liability include a claim that the
safety fault was an inevitable result of obeying the law (e.g.,
the agribot could be safer but for provisions of the law which
exclude the use of certain technology). Also, it is a defence
that the manufacturer could not have made the product more
secure or safer given the state of knowledge in science and
technology (‘development risk defence’) (art 7(e)). Therefore,
it is a defence under the law that the state of scientific or tech-
nical knowledge at the relevant time was such that the manufac-
turer could not have known of the defect in the product. This sug-
gests that the law does not expect manufacturers or designers
to wait until a safer technology is available before introducing
their products. All that is required is that the standard of safety
corresponds to the state of the art in scientific or technological
knowledge at the relevant time. However, the Directive makes
this defence optional, and it would, therefore, only avail the
manufacturer where it is provided for under national law.6
It is important to stress that requirement for proof, and
indeed the definition of a defect under the law is not intended
to undermine consumer protection. Rather, it is intended to
strike a reasonable balance between the obligation to protect
consumers and the need to promote innovation in a fast-
evolving technology environment. For example, while owing
to the complexity, technicality and probabilistic behaviour
of products like an agribot, it may be difficult and expensive
for claimants to prove a defect, it must also be assumed that
developments in artificial intelligence, robotics and machine
learning would mean that safety standards become outdated
fairly quickly. Therefore, unless the law limits the liabil-
ity of manufacturers to safety standards based on the state
of scientific and technical knowledge, their liability could
be indeterminable or infinite, and this may adversely affect
innovation and development.
It is also relevant to note that damage includes damage
caused by death or personal injury and damage or destruc-
tion caused to property other than the defective product itself
(art 9). Liability imposed by the law cannot be excluded or
limited by contract and can be joint and several.7 However,
member states may provide for the limitation of liability for
damage resulting from death or personal injury, provided that
the amount shall not be less than 70 million ECU (arts 5, 12).
2.1.2 Accidents andhealth andsafety law
In the UK, health and safety law is implemented through
the provisions of the Health and Safety at Work etc. Act
(HASAWA) 1974. The Act enables the enforcement body,
the Health and Safety Executive (HSE), to bring criminal
prosecutions under Section 33 of the Act against organiza-
tions deemed to have breached the statutory duties it imposes.
The primary duties imposed by the Act are described in
Sect. 2 and Sect. 3 of the Act. The former imposes duties on
employers to ensure the safety and health at work of employ-
ees; the latter on employers (and self-employed persons) to
ensure the safety at work of those persons other than their
employees who could be harmed by the employers’ under-
taking. An undertaking is defined by the set of activities
carried out by an organization; this extends to the design and
manufacture of products such as agribots and includes their
use. Therefore, an accident whereby a member of the public
is injured by an agribot could result in a criminal prosecution
against the owner/user of the agribot and/or the designer/
manufacturer, the balance of this prosecution depending
mainly on the nature of the accident.
Section6 of HASAWA1974 imposes duties on the manu-
facturers etc. (including designers) for the safety of articles
used at work. Therefore, prosecutions could hypothetically
also be initiated for a breach of this Section; however, in
reality, this is seldom the case.8 Further, the duties of design-
ers and manufacturers of agribots are better described under
6 See arts 7 and 15(1)(b) Directive, See also Sect.4(1)(e) of the UK
CPA which infact allows this defence.
7 Joint and several liability means the person injured by a defective
product can sue multiple parties and recover full damages from one
and/or all of them.
8 The HSE public register of convictions indicates around thirty
successful prosecutions under Section 6 in 10 years.
the Consumer Protection Act 1987 and/or the Supply of
Machinery (Safety) Regulations 2008.
Prosecutions for breach of duties under Sections 2–6 of
HASAWA 1974, if elevated to the Crown Court, invoke a
potential maximum penalty of two years' imprisonment
and/or an unlimited fine. In all cases described above, the
duty is qualified and limited by the term ‘so far as is reason-
ably practicable’ (SFAIRP). This is also commonly phrased
as the duty to reduce risk to a level that is ‘As Low as Rea-
sonably Practicable’ (ALARP). These terms are largely
interchangeable, the former used in legislation, the latter
commonly used in engineering communities.
The key element is the concept of reasonable practicability. This was defined in common law decades before9 the
implementation of HASAWA 1974 and provides a fundamental means both to limit the duty imposed by the Act
and to mitigate the liability incurred following an accident
and resultant prosecution. If the defendant(s) can demonstrate that all reasonably practicable measures were taken
to reduce the risk, they thereby demonstrate that they fully
discharged their duties under HASAWA 1974.
Demonstration that all reasonably practicable measures
have been taken (often termed ‘demonstration of ALARP’)
requires that the following measures be taken10:
1) Identification of reasonably foreseeable hazards and
assessment of risk;
2) Adoption of authoritative good practice for control of
risk;
3) Identification of further practicable risk reduction meas-
ures;
4) Implementation of identified risk reduction measures
unless it can be demonstrated that the sacrifice (cost,
time, effort) associated with doing so is grossly dispro-
portionate to the safety benefit gained from the measure.
The above steps (2)–(4) are further predicated on the
assumption that the overall risk to the safety and health
of persons affected by the activity/product/system under
assessment is, in general, tolerable. If the risk is assessed as
intolerable, then the owner of the duty to reduce that risk
must do so regardless of any consideration of sacrifice. HSE
guidance R2P2 provides a quantitative baseline definition of
intolerable and tolerable risk.11
Where risks are well understood and defined by an industry
body of knowledge, completion of steps (1) and (2) above will
be sufficient to demonstrate ALARP. This can include com-
pliance with legislation, approved codes of practice (ACOP)
and in some cases engineering standards, where these can be
shown to be directly and fully applicable and correctly applied.
Where such compliance is not possible, for example,
because the technology associated with activity/product/
system is new or novel, or because it is not possible to fully
comply with relevant standards, further effort will need to
be expended on risk assessment and/or engineering study,
to determine what can be practically done to reduce the risk.
Demonstration of gross disproportion relies upon the
assessment of the benefit of the risk reduction measure and
consideration of the sacrifice (e.g., financial cost) of implementation of the measure. The concept of gross disproportion
ensures that this is not a straightforward cost–benefit analysis,
whereby the owner could decline a measure whenever the sacrifice simply
exceeds the benefit; rather, the sacrifice must grossly exceed the benefit
before the duty to implement the measure is discharged.
The above assessment can often be carried out quali-
tatively, for example, through use of a continuous matrix
(such as the Boston Square), placing the effectiveness of a
risk reduction measure on one axis, and difficulty involved
in implementing the measure on the other axis. Potential
improvement measures are then ranked relatively against
each other. There are also a number of simplified screen-
ing tools in general use that highlight qualitatively those
measures that should be implemented, should not be imple-
mented, and those which require further study. In all cases,
these qualitative methods will need to take account of the
requirement to demonstrate gross disproportionality between
the sacrifice and the safety benefit.
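As a minimal illustration of the qualitative screening described above, the following Python sketch scores candidate measures on the two Boston Square axes (effectiveness and difficulty of implementation) and ranks them relatively. The measures, 1–5 scores and screening thresholds are invented for illustration, not drawn from any standard or guidance.

```python
# Illustrative sketch only: a Boston-Square-style qualitative screening of
# candidate risk reduction measures. All names, scores and thresholds are
# hypothetical examples, not values from any authoritative source.

# Each measure is scored 1 (low) to 5 (high) for effectiveness, and for
# difficulty of implementation (cost, time, effort).
measures = {
    "collision detection + autonomous avoidance": {"effectiveness": 5, "difficulty": 4},
    "built-in lighting with interlock":           {"effectiveness": 4, "difficulty": 2},
    "hi-visibility paintwork":                    {"effectiveness": 2, "difficulty": 1},
    "reflective strips":                          {"effectiveness": 2, "difficulty": 1},
}

def screen(effectiveness: int, difficulty: int) -> str:
    """Place a measure in one of three qualitative screening bands."""
    if effectiveness >= 4 and difficulty <= 2:
        return "implement"       # high benefit, easy: clearly reasonably practicable
    if effectiveness <= 2 and difficulty >= 4:
        return "reject"          # low benefit, hard: candidate for gross disproportion
    return "further study"       # everything in between needs more assessment

# Rank measures relatively against each other (higher net score first).
ranked = sorted(measures,
                key=lambda m: measures[m]["effectiveness"] - measures[m]["difficulty"],
                reverse=True)
for name in ranked:
    s = measures[name]
    print(f"{name}: {screen(s['effectiveness'], s['difficulty'])}")
```

Any such screening output remains subject to the gross disproportion requirement; "reject" here only flags a measure for a fuller disproportion argument, it does not discharge the duty by itself.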
Where sufficient information is available, and where
the resolution of the cost/benefit decision is less clear (for
example, an initial screening tool results in the require-
ment for further study), a full quantitative assessment can
be undertaken. This requires the quantification of the full
lifecycle risk without further mitigation (sometimes termed
the vanilla risk), for example, in terms of Potential Loss of
Life (PLL) or Fatalities and Weighted Injuries Rate (FWI);
similar quantification of the risk reduction measure(s); and
combination of these values with a Value for Preventing a
Fatality (VPF).12 The sacrifice associated with implementing
these measures is then calculated, and the measure imple-
mented unless the sacrifice is found to be grossly dispropor-
tionate to the safety benefit.
9 Judgement of Lord Asquith in Edwards vs National Coal Board
1949.
10 Health and Safety Executive. Reducing Risks Protecting People
(R2P2).
11 1 × 10⁻³ fatalities per annum for workers, 1 × 10⁻⁴ fatalities per annum for members of the public.
12 R2P2 provides a value of £1,000,000 in 2001; this value, when subject to a reasonable allowance for inflation, should be considered a minimum. Various higher values have been applied in different industries.
Definitions of gross disproportion vary depending on
context; however, a useful rule of thumb is to consider the
initial level of risk. Where that initial risk is tolerable but
high, i.e., close to the border with the intolerable region, the
gross disproportion factor should be similarly high. Where
the risk is tolerable but low, the gross disproportion fac-
tor may also be lower. In some industries, in some circum-
stances, a sacrifice that is 3 × the benefit may be considered
grossly disproportionate; whereas in other cases, a factor of
10 × may be required before a measure should be considered
not reasonably practicable to implement.
The Management of Health and Safety at Work Regula-
tions 1999 impose a duty on employers to undertake a suitable
and sufficient risk assessment in support of the duties placed
upon them by Sections2 and 3 of HASAWA1974. However,
even were this not the case, a demonstration that risk has been
reduced ALARP is challenging to achieve without carrying
out such an assessment. In fact, the requirement for risk assess-
ment has arguably been part of UK common law since 1949.13
The requirement for risk assessment, should not be con-
fused with a requirement for risk analysis. For a risk assess-
ment to be suitable and sufficient, it must demonstrate that
appropriate action has been taken to reduce the risk. Where
sufficient information is available, a detailed analysis in sup-
port of this action may be beneficial. However, this is often
not required, and sometimes not justifiable. For example,
where there are high levels of uncertainty associated with a
particular hazard, which render conventional risk assessment
techniques unreliable, a precautionary principle14 should
be adopted. This principle requires that the assessment, and
the action taken, be based more on the putative consequences of a risk than on its likelihood.
In the case of agribot use/design/manufacture, where
authoritative good practice is still largely to be defined,
compliance with health and safety law will depend on the
suitability and sufficiency of the risk assessments carried out
by duty holders. Further, whereas the balance of prosecutions in the UK as a whole tends to focus more on immediate
causation15 (i.e., the persons/organizations who ‘last
touched the risk’), the nature of autonomous robots may
necessitate a greater focus on the prosecution of
designers and manufacturers. They may be more frequently
called upon to present formal safety justifications of their
autonomous products that demonstrate anterior identification, consideration and management of relevant hazards and
risks. A complete justification will necessarily include the
documentation of critical design decisions, the identified
practicable risk reduction measures, and reasonable justifications for the measures rejected, as well as for those adopted.
For users/owners to discharge their safety and health
duties, they may be largely dependent on the decisions
taken autonomously by the agribots. As a corollary, the extent
to which they can be held liable for those autonomous decisions is limited by the extent to which they can train/teach
the agribot before full operations; this is in turn limited by
the safeguards and risk reduction measures defined by the
designer as a result of their risk assessment. As with all
risk reduction measures, a hierarchy of control16 should be
adopted by designers.
Elimination of hazards during the early phases of design
should be prioritized; where hazards cannot be eliminated
they should be controlled primarily by engineering means,
for example, safety functions17 that bring the agribot into
a safe state upon detection of a failure or the presence of
a member of the public in close proximity. Lower levels
of this hierarchy will necessarily include the provision of
instructions for use, informed by the suitable and sufficient
designer risk assessment. In effect, the users will be respon-
sible for management of the residual risk associated with
the agribot, i.e., those risks which could not be designed
and engineered away.
Notwithstanding the above, there is guidance avail-
able that will be partly applicable to the use of agribots
and may assist users of agribots with the implementation
of safe systems of work. This will necessarily include
appropriate traffic management arrangements18, includ-
ing measures to ensure exclusion of the public, route plan-
ning, lighting and visibility, where necessary, as guided by
13 Edwards vs National Coal Board 1949; ‘Moreover this computation falls to be made by the owner at a point of time anterior to the accident.’
14 R2P2 Reducing Risks Protecting People.
15 Review of prosecutions for 2016/17 under the CDM Regulations 2015 describes a total of seven (7) potential breaches of Principal Designer/Designer duties, whereas a total of ninety-nine (99) potential breaches of client duties, four-hundred and eighty-nine (489) potential breaches of Principal Contractor duties, and two-hundred and seventy-eight (278) potential breaches of Contractor duties were identified.
16 Several different hierarchies are available, for example, the commonly used ERIC PD (Elimination, Reduction, Isolation, Control, Procedures, Discipline) and the hierarchy provided in the Provision and Use of Work Equipment Regulations (PUWER) 1998, where fixed guards shall be provided to prevent exposure to dangerous parts of machinery, wherever practicable, and where not so, the provision of other guards or protection devices. Information, instruction, training and supervision are in all cases the lowest level of the hierarchy for the control of identified risks.
17 Safety functions designed in accordance with BS EN IEC 61508 and BS EN IEC 62061.
18 For example, INDG199 HSE leaflet on Workplace Transport Safety and HSG136 HSE guidance on Workplace Transport Safety, both of which are freely available electronically from the HSE website.
manufacturer-provided instructions for use in combination
with suitable and sufficient user risk assessment.
Health and Safety law in the UK is primarily goal-setting
and requires a regime of self-regulation to ensure compliance with HASAWA 1974, particularly Sections 2 and 3.
Therefore, the measures, guidance, and techniques outlined
above are applicable, regardless of whether any specific,
prescriptive regulation exists. In all cases, applicable good
practice should be sought, and the duty owner(s) should
determine appropriate measures to reduce the risk to a
demonstrably ALARP level using an appropriate hierarchy
of risk control measures.
For example, in the event that an agribot may be used in
low-visibility environments, such as mist/fog, or nighttime
working, the designers would need to consider the measures
that could be designed into the system to reduce the risk.
For example, a designer could not demonstrate that risk had
been reduced ALARP by recommending in the instructions
for use that the agribot wear hi-visibility clothing, regardless
of how humanoid in appearance the agribot may be! Firstly,
this is because Personal Protective Equipment (e.g., hi-vis
jackets) always forms the lower ranks of any hierarchy of
risk control measures; correct use of PPE is always subject
to human error or violation. Secondly, hi-visibility cloth-
ing is used primarily to protect the wearer, whereas in this
scenario, persons most at risk would likely be those driving
other vehicles that could potentially impact the agribot. It
should be clear to a designer that, even in the event of a
hypothetical stipulation in the instructions for use that the
agribot should not be used in periods of low visibility or
at night, use in such conditions would certainly constitute reasonably foreseeable misuse. As such, the designer
has an obligation to ensure that the agribot is provided
with reasonably practicable measures to increase visibility
(e.g., lights) and/or other measures to avoid collision (e.g.,
horns/audible warnings). For example, practicable measures could include (but are not limited to): collision detection
systems based on radar scanning and autonomous avoid-
ance; built-in lighting systems, potentially with safety sys-
tems that prevent operation in low-visibility environments
when lighting systems are non-functional; hi-visibility paint-
work; reflective strips, reflectors. A combination of these
elements would likely be necessary to demonstrate that risk
is reduced ALARP, subject to assessment as described in
the paragraphs above.
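One of the engineered safeguards listed above, a safety system preventing operation in low visibility when the lighting system is non-functional, can be sketched as a simple interlock. The status fields, lux threshold and function names below are hypothetical illustrations, not taken from any standard.

```python
# Illustrative sketch only: a lighting interlock of the kind described above.
# The sensor fields, threshold and structure are hypothetical assumptions.

from dataclasses import dataclass

@dataclass
class AgribotStatus:
    ambient_lux: float        # measured ambient light level
    lights_functional: bool   # self-test result of the built-in lighting system

LOW_VISIBILITY_LUX = 400.0    # assumed threshold for dusk/mist/fog conditions

def operation_permitted(status: AgribotStatus) -> bool:
    """Allow operation in low visibility only if the lighting self-test passes."""
    if status.ambient_lux >= LOW_VISIBILITY_LUX:
        return True                  # normal daylight: no lighting interlock
    return status.lights_functional  # low visibility: lights must be working

# Example: night-time operation with failed lights is inhibited.
print(operation_permitted(AgribotStatus(ambient_lux=50.0, lights_functional=False)))
```

A real safety function of this kind would be designed and verified in accordance with the functional safety standards cited in footnote 17, rather than as application code.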
A further example is the use of agribots on public high-
ways. From the above discussion and example, it should
be clear that no agribot should be used on public highways
unless reasonably practicable risk reduction measures are
implemented. Inherent in the definition of reasonable prac-
ticability is the concept of proportionality; measures taken to
reduce the risk should be proportional to the risk. Therefore,
in the event that an agribot is required to autonomously travel
on or across public roads, then collision avoidance safety sys-
tems must be designed-in, similar in extent to those required
for autonomous road vehicles. However, in the event that an
agribot can be supervised across a road crossing in manual
mode or remote mode the exposure to risk is lower, and it is
reasonable for the designed-in safeguards to be less onerous
(of course, provided that suitable controls are designed-in to
prevent inadvertent agribot access to public roads).
For the scenario of an agribot crossing a road in a super-
vised/manual/remote mode, the extent to which risk reduc-
tion measures can be designed-in would be firstly dependent
on the extent to which risk reduction measures are practica-
ble, i.e., technically feasible. For example, crashworthiness/
impact absorption, to prevent damage in passenger carrying
vehicles, and/or collision avoidance systems that effectively
distinguish between vehicle hazards, users, members of the
public, and livestock (which may be crossing simultaneously
with the agribot). Secondly, the designers would need to be
assured that they are not introducing additional hazards that
are potentially higher risk than the hazard they are trying to
control. For example, designer risk assessment may deter-
mine that any collision detection system should be deactivated while in manual or remote mode, to avoid risks to
the local user—such as autonomous avoidance resulting in
the robot reversing into a manual remote controller walking
closely behind it—or risks increased by non-execution or
delays to command responses. In this case, the system would
not be effective for mitigating risk of vehicle impact when
crossing roads. In such a scenario, complete with supporting
risk assessment, it may be that the designer is able to rea-
sonably discharge their responsibility for further reduction
of risk. This is provided that: suitable arrangements are
provided in design for agribot visibility as discussed above;
the manual/remote mode is generally and demonstrably safe
and reliable; a Safe System of Work can be adopted by the
user that follows the highway code, providing suitable warn-
ing to other road users that a crossing is taking place, and
controlling/excluding traffic, where necessary.
2.1.3 Accidents and negligence
Legal action in tort for negligence may also be taken against
manufacturers, agricultural contractors, operators and farm-
ers and their agents for injuries, loss or damages resulting
from negligence or accidents involving the agribot. How-
ever, unlike strict liability or liability without fault, a claim
in negligence requires the claimant to prove fault on the part
of the manufacturer or other person being sued. The follow-
ing must be established: the defendant(s) (such as the manufacturer/designer, contractor or farm owner) owed a duty of
care to the claimant; there was a breach of that duty (the
defendant failed to take care); and the claimant suffered harm
as a result of that breach (that is, personal injury, or damage to or loss of property, resulted).
Liability for negligence may fall on any of the parties
depending on the cause of the accident, and who owes or is
owed a duty of care in the circumstances of each case. For
example, the position of the law is that the manufacturers
owe a duty of care to persons who use their products and
manufacturers would be deemed to have breached this duty
where there is a defect in the product. A cause of action (the
basis for suing the manufacturer) arises where injury or loss
results from the defect. For liability, it is immaterial that the
claimants did not purchase the product themselves. There-
fore, suppliers, farmers, contractors and their agents or other
users who may be injured by any defect in the agribot would
be entitled to sue the manufacturer for negligence. From
the perspective of the consumer, an action in negligence
provides additional protection as product defect may raise a
prima facie case of negligence.19
Apart from defects, liability for negligence may arise in
cases of misuse mainly where manufacturers fail to provide
instructions or where the instructions are inadequate or mis-
understood. Under the EU Machinery Directive,20 [transposed
in the UK as the Supply of Machinery (Safety) Regulations
2008] the manufacturer or his authorized representative is
required to provide necessary information such as instruc-
tions before putting machinery on the market and/or put-
ting it into service (art 5 Machinery Directive). Regarding the
general principles for drafting instructions, the Directive
provides that instructions must be drafted in one or more
official Community languages (of the EU), and in the case of
machinery intended for use by non-professional operators,
the wording and layout of the instructions for use must take
into account the level of general education and acumen that
can reasonably be expected from such operators (Machinery
Directive item 1.7 annex 1).
It is, therefore, a question of fact depending on the cir-
cumstances of a case whether warning or instruction is suf-
ficient and whether the manufacturer is liable or not. For
example, instructions and warnings full of probabilities and
equations provided to intermediaries (such as agricultural
contractors) may be sufficient if the contractor is trained in
and has a good understanding of the agribot. Conversely,
the same instruction addressed to farmers, who (presumably)
have less technical knowledge, may need to be more
basic. Therefore, in a hypothetical scenario where a farmer
misunderstands the instructions and assumes the agribot is
safer than it actually is and thereby causes the agribot to
malfunction and kill a walker, a brochure full of probabili-
ties may be deemed too complicated, and the manufacturer
may be held liable for an accident caused by the farmer's mis-
use. The key principle is, therefore, that instructions must
be pitched at a level at which both technical and non-technical users of the agribot can understand them.
Other provisions of the Machinery Directive particu-
larly relevant to the agribot include the requirement that the
contents of the instructions must cover both intended use
of the machinery and any reasonably foreseeable misuse.
Also, where applicable, the instruction manual must contain
warnings concerning ways in which the machinery must not
be used that experience has shown might occur (item 1.7.4
annex 1 to the Machinery Directive). These provisions suggest that manufacturers would still be deemed to have complied with the law if they fail to give warnings on uses and
misuses which were not known at the time of manufacture or
design but which subsequently become known due to self-learning, artificial intelligence (AI) processing, or repurposing
of the robot. They also suggest that apart from the manufac-
turer, other users of the agribot could be liable if they ignore
clear instructions and warnings or continue to use the agribot
after discovering that it has malfunctioned due to failure to
follow instructions. However, to benefit from the presump-
tion of conformity with the health and safety requirements
under the Directive, manufacturers are required to affix CE
marking to their product and draw up a declaration of
conformity (arts 5, 7).
2.1.4 Accidents caused byagents, employees
andcontractors
Liability for accidents caused by third parties depends on
whether the person who caused the accident is an agent or an
independent contractor. Under the law, a principal is vicari-
ously liable for the acts and omissions of his agent when the
agent is acting within the scope of his authority. The scope
of an agent’s authority is defined by a contract between the
agent and the principal. As an example, therefore, liabil-
ity for acts or omissions of the operator of the agribot will
depend on whether he is an agent of the manufacturer or
the agricultural contractor or whether he is an independent
contractor. Similarly, if there is a franchise agreement, the
franchisor's liability will depend on whether the franchisee
acts in the capacity of an agent. Therefore, while the law
does not automatically infer an agency relationship from a
franchise, agency can be inferred from the contract and
the circumstances of the case.
As also noted above, liability might depend on whether
third parties, such as employees, agents or contractors,
receive adequate instructions on the use of the product. As
an example, under the Provision and Use of Work Equip-
ment Regulations (PUWER) 1998 (UK), businesses which
either use or hire out work equipment are required to manage
the risks from the equipment. Risk management includes
19 This means the fact of a defect is sufficient to raise a presumption of negligence unless it is disproved.
20 Directive 2006/42/EC came into effect on 29 December 2009 and
replaced Directive 98/37/EC.
ensuring that all people who use or manage work equip-
ment receive adequate instructions and appropriate train-
ing. Therefore, apart from the manufacturer, operators of the
agribot, agricultural contractors and farmers may also be legally
liable for accidents caused by third parties due to misuse.
Furthermore, under the Occupiers Liability Act 1957 and
1984, an occupier, that is, a person in control of land, prem-
ises or buildings can be held liable for injury or harm to
another person on the land. Such persons can include work-
men, residents, visitors, strangers or even trespassers. One of
the conditions for the assumption of liability is that the harm
is caused by a person over whom the occupier has control or
over whom he could exercise some degree of control. It is,
however, important to note that this liability can be excluded
by contract.
Finally, damage caused by the escape of things likely to
cause mischief is borne by the owner of the land, provided
the damage is a reasonably foreseeable consequence
of the escape (this is the rule in Rylands v Fletcher).21 In
practice, this might mean a farm owner or farm manager
could be liable if he (or his agent or anyone under his con-
trol) allows the agribot or things used by the agribot such as
herbicides to ‘escape’ to adjoining lands or farms, and for
damages resulting from such escape. This position poses
little problem when the agribot is operated in manual mode,
as the operator is deemed to be in control. However, when
operating autonomously, the risk of ‘escape’ may be heightened,
and farmers or other users of the agribot may have to adopt
additional measures to avoid liability. This may include clos-
ing escape routes and putting warning signs at different ends
of a road when the agribot is in operation. Although this is
not a legal requirement, in the UK, farmers routinely close
local roads to move herds of animals by placing signs and/
or people at both ends before releasing the animals. The
Health and Safety Executive (HSE) has also issued a number of pieces of guidance on public access for livestock which would be
relevant to the operations of the agribot.22 It is, however,
important to note that the Animals Act 1971 (UK) imposes
strict liability on keepers of animals which are of a danger-
ous species.23
The outstanding challenge from the above liability alloca-
tion regimes relates to how to resolve the attribution prob-
lem. For example, despite the clear provisions of the law,
it might be difficult to ascertain whether damage, injury or
loss was caused by a defect in the product or misuse such as
failure to follow instructions. It is conceivable for instance
that contractors or farmers would tend to attribute loss or
damage to product defect rather than to their misuse of the agribot.
It is also conceivable, considering the complex and technical
nature of the agribot and the fact that the law imposes liability
on the manufacturer for insufficient and unclear instructions,
that courts might be more inclined to hold manufacturers
liable in negligence than to hold users liable for misuse. One solution to this possible dilemma is to design the
robot with a detailed data logging system. This would create
a form of ‘liability by design’ which enables the agribot to
keep detailed logs of events and incidents, including possibly
replaying an accident to establish whether it was caused by a sensor
failure or a user command. A data logging system may, there-
fore, assist in identifying where liability falls where there is
a dispute as to whether accidents are due to manufacturer
defect or user misuse.
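A ‘liability by design’ logger of the kind proposed above might be sketched as follows. The event fields and the hash-chaining scheme are hypothetical design choices, intended only to show how an append-only, tamper-evident record could support later attribution between manufacturer defect and user misuse.

```python
# Illustrative sketch only: a tamper-evident agribot event log. The event
# schema and hash-chaining are hypothetical design choices, not legal
# requirements drawn from any statute or standard.

import hashlib
import json
import time

class EventLog:
    """Append-only log; each entry chains the hash of the previous one,
    so post-hoc edits to the record are detectable in a dispute."""

    def __init__(self):
        self.entries = []
        self._prev_hash = "0" * 64  # genesis value for the chain

    def record(self, event_type, detail, t=None):
        """Append one event (e.g. a sensor reading, user command, or fault)."""
        entry = {
            "t": time.time() if t is None else t,
            "type": event_type,
            "detail": detail,
            "prev": self._prev_hash,
        }
        self._prev_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = self._prev_hash
        self.entries.append(entry)

    def verify(self):
        """Re-compute the whole chain to check no entry has been altered."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev"] != prev:
                return False
            body = {k: e[k] for k in ("t", "type", "detail", "prev")}
            prev = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["hash"] != prev:
                return False
        return True

# Hypothetical accident timeline: command, sensor event, autonomous response.
log = EventLog()
log.record("user_command", {"cmd": "start_spraying", "mode": "manual"})
log.record("sensor", {"lidar": "obstacle_detected"})
log.record("action", {"cmd": "emergency_stop", "source": "autonomy"})
print(log.verify())  # True for an unmodified log
```

Replaying `log.entries` in order would then let an investigator see whether the last event before an incident was a sensor failure or a user command, which is precisely the attribution question raised above.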
2.2 Administrative actions
2.2.1 Regulation ofenvironmental damage anduse
ofchemicals ingeneral
The application of chemicals which may affect the environ-
ment is tightly controlled. This includes the robotic applica-
tion of fertiliser and pesticide chemicals, and their potential
effects on the human food chain, water supply, neighbour-
ing farms, farm staff, and the wider public environment.
Liability for damage to the environment by activities of
businesses is regulated by Directive 2004/35/EC of the EU
Council on Environmental Liability regarding Prevention
and Remedying of Environmental Damage [transposed in
the UK as Environmental Damage (Prevention and Reme-
diation) (England) Regulations 2015]. The Directive adopts
an administrative approach. It does not, therefore, apply to
cases of personal injury, damage to property or economic
loss and does not affect any right regarding these types of
damages (recital 14).
The relevant provisions of the law impose strict liabil-
ity (based on a ‘polluter pays principle’) for pollution of
the environment caused by certain activities including the
manufacture, use, storage, processing, filling, release into the
environment and onsite transport of plant protection products. Plant protection products include products for destroy-
ing undesired plants and damage includes damage to water
and soil. Although liability is strict, a causal link between
the activity and the damage must be proved, and the law
allows cost allocation in cases of multiple party causation
especially concerning the apportionment of liability between
the producer and the user of a product (art 9). Where envi-
ronmental damage has occurred, the operator is required to
inform the competent authority and take practical remedial
21 [1868] UKHL 1.
22 See e.g. HSE, ‘Cattle and Public Access in England and Wales:
Advice for Farmers, Landowners and Livestock Keepers’ http://www.hse.gov.uk/pubns/ais17ew.pdf accessed 03/05/2017.
23 Animals Act 1971, s 1.
actions to remove or otherwise manage the contaminant (art
6). The operator bears the costs for the preventive and reme-
dial actions taken under the Directive (art 8).
Effects on water supplies are controlled by Directive
2000/60/EC (Water Framework); Directive 2008/105/EC;
and Council Directive 98/83/EC (Drinking Water Directive),
which set limits on levels of chemicals which may enter pub-
lic water systems. Restrictions on the classification, label-
ling, and packaging of substances and mixtures are defined
in European Regulation (EC) No 1272/2008.
2.2.2 Use offertilisers
Apart from general chemical laws, fertilisers are covered by
additional laws.
The EU Nitrates Directive 91/676/EEC aims to protect
water quality across Europe by preventing nitrates from agri-
cultural sources polluting ground and surface waters and
by promoting the use of good farming practices. The 2003
European Fertilisers Directive covers the sale, manufacture and
labelling of fertilisers; this will include the sale of
fertilisers supplied as part of a robotic package.
Ammonium nitrate fertiliser may be used as an ingredi-
ent of explosives, so falls under anti-terrorism laws which
control its storage security. In the UK these include Control
of Major Accident Hazards Regulations (COMAH); Danger-
ous Substances (Notification and Marking of Sites) Regula-
tions 1990; Ammonium Nitrate Materials (High Nitrogen
Content) Regulations; and Planning (Hazardous Substances)
Regulations.
The EU Single Payment Scheme subsidises farms but in
return imposes environmental protection requirements on
them which may include limits of fertiliser levels. Further,
Nitrate Vulnerable Zones (NVZs) are areas designated as
being at risk from agricultural nitrate pollution (including
about 60% of land in England). Within them, additional legal
limits on the amounts of fertiliser, and the times of year at
which it may be used, are imposed by the Nitrates Directive and
Drinking Water Directive.
2.2.3 Use ofherbicides, pesticides andbiocides
In addition to general chemical laws, pesticides—and more
generally, “biocides”—are covered by additional laws. A
“herbicide” is a chemical which kills one or more plant
types; a “pesticide” is a chemical which kills “pests” includ-
ing weeds, fungi and insects; a “biocide” is a chemical which
harms any animals, humans or the environment.
The 2009/128/EC Directive on Sustainable Use of Pesti-
cides [implemented in the UK as “PA Certificates of Compe-
tence” via the transposed Plant Protection Products (Sustain-
able Use) Regulations 2012] aims to protect surface water
and drinking water from pesticide contamination. Also,
pesticide use is to be reduced in areas used by the general
public and in nature conservation areas. It aims to reduce the
risks and impacts of pesticide use on human health and the
environment and promote the use of integrated pest man-
agement and alternative approaches, such as non-chemical
ones. The directive requires operator training for different
pesticide types and applicator types.24 It also bans aerial
spraying in all forms, including by autonomous drones and
manually piloted helicopters. In practice, this aerial ban has
proved problematic for several control targets, including
needle blight in trees and bracken in moorland. However,
the directive also allows member states to grant exemptions,
on application, to users for specific nationally approved
plans such as these, usually administered by their
environmental agencies (for example, the UK currently
has around three such approved plans, used under permits
issued to tens or hundreds of individuals).
The EU Biocides Regulation 528/2012 regulates all sub-
stances harmful to humans, animals and/or the environment,
i.e., biocides, requiring authorisation for their use. Bulk
authorisation is provided to users of “on-label” products,
where the substance manufacturer has handled safety testing
and defines the appropriate dose size and use-case for appli-
cation, on a product “label”, 25 and the user operates within
these parameters. When using “on the label”, liability for
damages caused by the product is transferred from the user
to the manufacturer. If a user chooses to use the product at a
different dose or for a different use-case, this is “off-label”
usage, and the user retains the liability. To comply with the
Biocides Regulation, the user must thus obtain their off-label
authorisation, e.g., via an application for a permit from their
national Environment Agency.
The certification system for human operators appears to
pose little problem to the robotic application where the agri-
bot is legally considered as a tool of a named human opera-
tor and uses an existing applicator type, such as a knapsack
or bulk sprayer system. In this case, that human operator
must hold the required certifications for the herbicide and
applicator type. Definitions of applicator type may become
problematic for robots using novel applicators, such as pro-
totype per-plant precision devices. If the robot operates oth-
erwise than as a tool for example under a framework which
recognises the legal personhood of autonomous robots (See
Legal Personhood section below), then the definition of cer-
tification again becomes problematic.
24 Until 2015, a “grandfathering” scheme allowed existing operators
to practise without certification; this is no longer the case.
25 Usually a long and highly detailed legal document, not a physical
label on a chemical container.
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
AI & SOCIETY (2020) 35:113–134
As with operator certification, definitions of on-label
application use-cases are likely to function for agribots
spraying using similar technology to a manual knapsack or
tractor-mounted devices, under legal operation as human
tools; but the use of novel applicator methods or non-tool
operation are likely to be problematic or at least require cus-
tom national environment agency licensing.
For manufacturers and sellers of herbicides, additional
rules are provided in the Machinery Directive amendment
2009/127/EC on pesticide application machinery (transposed
in the UK as the EC Fertiliser (England and Wales) Regulations
2006; see also the UK Fertilisers Regulations 1990/1991). As
with fertilisers, these may apply to agribot operators selling
herbicide as part of a robot product or service package.
2.2.4 Radio communications—scarce spectra
An often-overlooked aspect of agricultural robotics systems
is the need for long-range communications links from the
robot in the field to a base station, which in some cases form
systems as complicated as, or more complicated than, the
robots themselves. Such communications links, as illustrated
in Fig. 4, are required if the robot is operating as a tool rather
than as a legal person—so that the named human operator can
monitor its condition sufficiently to intervene in emergencies
and to take responsibility for its actions. In practice, this will
often require a video link to monitor the robot’s cameras in
real time. Video is a bandwidth-hungry medium which
often requires specialist communications links and equip-
ment. Radio bandwidth is a limited and valuable26 resource
which must be shared with other local users, so is tightly
regulated in most countries. Hence the legal need for the
human operator to monitor and take responsibility for the
robot’s actions must be balanced against the legal restrictions
on spectrum resources. In the EU, spectrum use is regulated
by the Radio Equipment Directive 2014/53/EU, and currently
in the UK by the Communications Act 2003.
Most current radio communications operate on single, or
small groups of, identifiable frequencies. Radio communi-
cations have been used from 3Hz to 300GHz, with bands
around higher frequencies carrying more bandwidth than
those at lower frequencies, but lower frequencies propagat-
ing over long distances more efficiently. Two users transmit-
ting on the same frequency in the same area will interfere
with each other’s signals. Countries’ laws initially assign the
rights to transmit on all frequencies to a government body
called the regulator (in the UK this is the Office of Communi-
cations, OFCOM; in the USA, the Federal Communications
Commission, FCC). The regulator is then responsible for
managing allocations of these frequencies in local areas to
users.
International standards exist, via the International Tel-
ecommunication Union (ITU), designating certain fre-
quency bands for particular types of use, including for
national broadcasting, cell phone data, emergency services
and military communications, and amateur (‘ham’) radio.
The same standards assign further bands for licensed com-
mercial use, and others for unlicensed public use within the
defined power and use-type limitations. This allows products
to operate in the same bands between countries. The regula-
tor typically implements these standards via its licensing to
users.
Public channels. Domestic ‘WiFi’ (802.11) radio is often
used for research agricultural robot communications, requir-
ing no special permission from the regulator. In the UK,
OFCOM allows transmission of data on several frequen-
cies around 2.4GHz for this purpose but limits transmitter
power to 100mW, which typically can stream video up to
around 100m ranges. Many domestic (e.g., up to 250mW)
and other devices (e.g., many watts) are technically able to
transmit at higher powers (achieving longer ranges), but this
would violate the OFCOM regulation.27 Specialist anten-
nas can concentrate the beam transmitted in specified direc-
tions to enable long-range point-to-point communications.
However, OFCOM power regulations apply to the power
level receivable at any location rather than to the source
transmission power. This means that no legal range exten-
sion benefit is obtainable through their use—a 100mW
source concentrated along a beam to a destination may have
the same, illegal, received power as a 1W omni-directional
source. Across the EU, a public 433MHz band may also
be used for low power, short-range communications, suit-
able for sending control commands and occasional sensor
data, but not live video. Across the world, some bands are
allocated for public amateur (“ham”) radio, by the regulator
transferring use to hobbyist organizations, who then allow
their certified members to use them, under restrictions such
as preventing purely private use (such as encrypted or closed
protocol data).
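The received-power rule described above can be illustrated with a simple link-budget sketch. This is illustrative only: the 10 dBi antenna gain is an invented figure, and expressing the limit as effective isotropic radiated power (EIRP) is our assumption about how a power-at-receiver rule is operationalised, not a statement of OFCOM’s exact method:

```python
import math

def dbm(milliwatts: float) -> float:
    """Convert power in mW to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(milliwatts)

def eirp_dbm(tx_power_mw: float, antenna_gain_dbi: float) -> float:
    """Effective isotropic radiated power along the beam: transmitter
    power plus antenna gain, which is what a receiver in the beam sees."""
    return dbm(tx_power_mw) + antenna_gain_dbi

# A 100 mW source feeding a hypothetical 10 dBi directional antenna...
beam = eirp_dbm(100, 10)   # 20 dBm + 10 dBi = 30 dBm
# ...delivers the same power along its beam as a 1 W omni-directional source,
omni = eirp_dbm(1000, 0)   # 30 dBm
# so both would exceed a 100 mW (20 dBm) limit on receivable power.
assert beam == omni and beam > dbm(100)
```

Under such a rule, transmit-side antenna gain therefore buys no legal headroom: it only redistributes the same receivable-power budget into a narrower beam.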
Where public radio channels are insufficient, the regulator
may lease other dedicated bands in local areas for exclusive
use by specified users, usually for a significant fee. Wider
bands cost more but allow higher data rates (in the UK,
OFCOM’s main schemes are called “technically assigned”
and “area defined” licences).
26 The high values have been most visibly demonstrated in many
countries’ recent auctions of spectra to mobile phone companies.
27 From a safety perspective, it should also be noted that 2.4GHz is
a microwave frequency, similar to those used in microwave ovens,
which operate at hundreds of watts for cooking. Multi-watt WiFi trans-
mission may be harmful to human tissue as well as illegal.
2.2.5 Privacy anddata protection
Perhaps one of the most significant aspects of the liability
regime for the agribot is the privacy and data protection
implications of the information collected during its opera-
tions. Agribot data collection tasks may include monitoring
of soil and plant conditions, as well as building up maps of
farms for general navigation. This data can be commercially
valuable not just to the landowner but to others who might
have financial interests in the land (such as deciding whether
to buy it) or in collating data from millions of farms to per-
form large-scale analysis. Many small farms are owned
and operated as sole traderships rather than as limited
company structures, linking their data directly to a named
individual, the sole trader, and thus making it “personal” data.
According to the European Parliament Committee on Legal
Affairs, for example, AI and robotics can potentially generate
large amounts of personal data that can be used as currency to
purchase services.28 The relevant law is the EU General Data
Protection Regulation (GDPR) 2016, which repealed Direc-
tive 95/46/EC (the Data Protection Directive). The Regulation
entered into force on 24 May 2016 and will apply from 25
May 2018.29 The Regulation applies to the processing of
“personal data”, defined as information relating to an identi-
fied or identifiable natural person (the data subject). While
many of the provisions centre on bridging perceived gaps in
the law given developments in information technology, the
provisions relating to principles of data processing, privacy
by design and automated decision-making are particularly
relevant to agribots.
(a) Principles of data protection
The principles relating to the processing of personal data
are as follows:
1. Lawfulness of processing—the Regulation provides
that processing of personal data must be fair, lawful
and transparent. Consent of the data subject is one of
the conditions for lawful processing (art 6). Moreover,
where processing is based on consent, ‘the controller’30
shall be able to demonstrate that the data subject has
consented to the processing of his or her personal data
and the data subject shall have the right to withdraw his
or her consent at any time (art 7).
It is important to note that although consent is not the only
mechanism for justifying the processing of personal data,
it remains a core principle of data processing. Therefore,
where consent is the basis of processing, it must be clear and
unambiguous as the consent of the data subject cannot be
inferred from conduct or inaction. In the case of the agribot,
there may be instances where it is unclear whether the data
is personal, or who owns the data for the purposes of consent.
For example,
Company C operates the robot on farmer X’s land and col-
lects detailed soil nutrient information during the run. C then
operates on neighbouring (and competing) farmer Y’s land
and makes use of X’s data to optimise the run on Y’s land,
with the result that Y ends up with a better-informed run than
X. Farmer Y might also be interested in buying land from
farmer X and could obtain private information about its con-
dition and value from the data. The collection is without X’s
consent. The question may arise whether Company C owes
any obligation to X concerning the collection and use of the
detailed soil information. On the one hand, because detailed
soil information relates to the soil condition and not to the
individual, it cannot constitute personal data. On the other
hand, because the collection may invariably involve the col-
lection of geolocation data, (which is deemed personal data),
Company C may require consent from X. It, therefore, seems
reasonable to obtain consent to any collection of personal
data where it would be difficult to isolate personal data from
the information collected.
2. Purpose limitation—collection of personal data must
be for specified, explicit and legitimate purposes, and
further processing in a manner that is incompatible with
those purposes is prohibited.
3. Data minimisation—personal data must be adequate,
relevant and limited to what is necessary in relation to
the purposes for which they are processed.
4. Accuracy—personal data must be accurate and where
necessary, kept up to date, and every reasonable step
must be taken to ensure that personal data that are inac-
curate, having regard to the purposes for which they are
processed, are erased or rectified without delay.
5. Storage limitation—personal data must be kept in a
form which permits identification of the data subject for
no longer than is necessary for the purposes for which
the personal data are processed. However, data may be
stored for longer periods if it will be processed solely
for archiving in the public interest, scientific or historical
research purposes or statistical purposes.
28 European Parliament (2014–2019) Committee on Legal Affairs,
‘Draft Report with Recommendations to the Commission on Civil
Law Rules on Robotics’ 2015/2103 (INL), p 8 (hereinafter Commit-
tee on Legal Affairs Draft Rules on Robotics).
29 See Regulation (EU) 2016/679 on the protection of natural persons
with regard to the processing of personal data and the free movement
of such data, and repealing Directive 95/46/EC. The GDPR replaces
Directive 95/46/EC on the protection of individuals with regard to the
processing of personal data and on the free movement of such data.
30 The controller is a natural or legal person, public authority, agency
or other body which, alone or jointly with others, determines the pur-
poses and means of the processing of personal data. See GDPR art
4(7).
6. Integrity and confidentiality—using appropriate techni-
cal or organizational measures, personal data must be
processed in a manner that ensures appropriate secu-
rity, including protection against unauthorised or unlawful
processing and against accidental loss, destruction or
damage.31
7. Restrictions on transfer—transfer of personal data
to third countries or international organizations may
take place without specific authorisation where the
(EU) Commission has decided that the third country
or organization ensures an adequate level of protection
for personal data.32
8. Accountability—The controller shall be responsible for
and be able to demonstrate compliance with the above
principles.
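Several of these principles, notably data minimisation and integrity, can be supported technically. As a minimal sketch, a farm’s direct identifier might be pseudonymised with a keyed hash before soil records leave the robot, with the key held separately so that re-identification requires additional information. All names and the record format here are hypothetical:

```python
import hmac
import hashlib

# Hypothetical secret key: for pseudonymisation, the additional
# information enabling re-identification must be stored separately
# from the dataset itself.
SECRET_KEY = b"held-separately-from-the-dataset"

def pseudonymise(direct_identifier: str) -> str:
    """Replace a direct identifier (e.g., a sole trader's name)
    with a keyed HMAC-SHA256 hash."""
    return hmac.new(SECRET_KEY, direct_identifier.encode(),
                    hashlib.sha256).hexdigest()

# A soil-sample record with the direct identifier replaced:
record = {
    "subject": pseudonymise("Farmer X (sole trader)"),
    "soil_nitrate_mg_per_kg": 42.0,
    # Note: precise geolocation may itself be personal data and may
    # need separate treatment (coarsening, aggregation or consent).
}
```

The same identifier always maps to the same pseudonym, so records can still be linked for analysis without exposing the farmer’s name to downstream users of the data.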
(b) Privacy by design and restrictions on automated deci-
sion-making
Article 25 of the Regulation mandates the implementa-
tion of privacy by design and privacy by default (PbD). The
specific provisions of the law are that data controllers shall
implement appropriate technical and organizational measures
for ensuring that, by default, only personal data which are
necessary for each specific purpose of processing are
processed. The obligation to implement privacy by default
applies to the amount of data collected, the extent of their
processing, the period of their storage and their accessibility.
In particular, the measures shall ensure that, by default,
personal data are not made accessible without the individual’s
intervention to an indefinite number of natural persons
(art 25(2)).
The Regulation recommends pseudonymisation as an
appropriate technical and organizational measure which
meets the requirements of the Regulation and protects the
rights of data subjects. However, in implementing the appro-
priate technical and organizational measures mandated by the
law, the controller shall take account of the state of the art,
the cost of implementation, and the nature, scope, context
and purpose of processing, as well as the risks of varying
likelihood and severity posed by personal data processing
to the rights of natural persons. Under article 22, the data
subject has the right not to be subject to a decision based
solely on automated processing, including profiling. The
data subject also has the right to be informed of the existence
of automated decision-making, including profiling, and a
right to an explanation of the logic underlying such decisions
as well as the significance and consequences of the processing
(the so-called right to an explanation).33
Although the provisions above may pose some difficul-
ties for the development and functioning of agribots (see
notes on the scope and application of data protection law
below), they are based on the (arguably correct) presumption
that the problem with automated decision-making is not so
much the inability of humans to predict the behaviour of
autonomous robots. The problem is rather the need for the
decision-making process to be transparent, in the interests of
accountability, reliability and trust. As a result, the algorithms
that underpin agribot systems need to be as transparent and
as interpretable as possible, and the agribots must be able
to explain their behaviour in terms that humans can under-
stand, from how they interpreted their inputs to why they
recommend a particular output (so-called explanation-based
collateral systems).34
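One way such explanation requirements might be approached in practice is for the agribot to log, alongside each automated decision, the inputs and rule that produced it. The following is a toy sketch only; the feature names, threshold and record format are invented for illustration and do not reflect any particular agribot system:

```python
import json
from datetime import datetime, timezone

def decide_spray(features: dict, threshold: float = 0.8) -> dict:
    """Toy per-plant spray decision that records a human-readable
    explanation alongside the decision itself."""
    p = features["weed_probability"]
    spray = p >= threshold
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "decision": "spray" if spray else "skip",
        "inputs": features,  # retained so the decision can be audited
        "explanation": (
            f"weed_probability={p:.2f} "
            f"{'>=' if spray else '<'} threshold={threshold}"
        ),
    }

# Each decision is serialised and retained for later explanation requests.
entry = decide_spray({"weed_probability": 0.91, "crop_proximity_cm": 12})
audit_line = json.dumps(entry)
```

Even for far more complex learned models, retaining the inputs and a summary of the decisive factors for each decision gives the operator something concrete to produce when an explanation is demanded.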
(c) Data breach reporting and administrative fines and pen-
alties
The Regulation also makes provision for mandatory data
breach notification and empowers supervisory authorities
(national public authorities, such as the Information Com-
missioner’s Office in the UK, that will monitor and enforce
the Regulation) to impose administrative fines which could
be potentially large (a maximum of 20 million Euros or 4%
of global annual turnover for the preceding financial year,
whichever is higher) for infringements of certain provisions
of the law.35 However, while supervisory bodies have the
power to levy fines and other sanctions, this does not pre-
clude individuals from bringing civil actions. In Vidal-Hall
v Google Inc.,36 for example, the court ruled that misuse of
personal information is an actionable tort. To comply, agribot
operators may thus need to invest in specialised secure data
storage facilities and consider the use of cryptography to
protect data stored on, and communicated by, agribots in
the field.
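The scale of this fines ceiling can be made concrete with a trivial calculation (figures as stated above; the function name is ours):

```python
def max_gdpr_fine_eur(global_annual_turnover_eur: float) -> float:
    """Upper bound on an administrative fine for the most serious
    infringements: EUR 20 million or 4% of the preceding financial
    year's global annual turnover, whichever is higher (art 83(5))."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

max_gdpr_fine_eur(100_000_000)    # 4% is 4M, so the 20M floor applies
max_gdpr_fine_eur(1_000_000_000)  # 4% is 40M, which exceeds the floor
```

Note that even a small agribot operator faces the same 20 million Euro ceiling as a large one; the 4% turnover limb only binds above 500 million Euros of turnover.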
2.3 Liability undercontract
Liability can arise under a contract between different par-
ties concerning the use and operation of agricultural robots.
Contractual agreements are particularly important, because
contracts define the rights, obligations and liabilities of par-
ties and the courts will enforce the terms of a contract vol-
untarily entered into. Therefore, where permitted by law,
31 See generally GDPR art 5.
32 General principles on transfer are contained in GDPR arts 44–50.
33 See also GDPR art 13(2)(f).
34 European Parliament Committee on Legal Affairs, ‘Artificial Intel-
ligence: Potential Benefits and Ethical Considerations’ p 4, http://
www.europarl.europa.eu/RegData/etudes/BRIE/2016/571380/IPOL_
BRI%282016%29571380_EN.pdf, accessed 13/03/2017.
35 See GDPR arts 33, 51, 58, 83(4) & (5).
36 (2014) EWHC 13 (QB).
parties may by contract exclude or limit their liabilities. For
example, parties may by contract agree that it is the duty of
the agricultural contractor to provide training to the farmer
on the use of the agribots. They may further agree that such
training excludes or limits the manufacturer’s liability for
accidents caused by misuse. The parties may also contract
to contribute towards damages for loss of reputation which
is likely to affect the manufacturer’s brand. It is also impor-
tant to note that under rules of privity of contract, only par-
ties to the contract can acquire rights or liabilities under the
contract. Therefore, in a contract between the farmer and an
agricultural contractor for the supply of agribots to be used
for killing weeds, the farmer can only sue the agricultural
contractor if the agribot was incapable of killing weeds. He
cannot sue the manufacturer unless the manufacturer is also
a party to the contract.37
Finally, the scope of remedies under a contract is wide,
and a party can seek various reliefs for damage caused by a
breach by the other party or parties. Therefore, an innocent
party may ask to be discharged from further obligations to
the party in breach, or claim damages for loss suffered. In
cases of disputes or claims for breach of contract, courts
will usually give effect to the terms of the written agreement
between the parties without extraneous evidence. The con-
tract, therefore, serves as evidence of the intention between
the parties and must be carefully drafted particularly when
it involves multiple parties.
2.4 Relevant standards
There are thousands of voluntary technical standards which
have been established by many organizations for various
uses, a full treatment of which is beyond the scope of this
article. The following is thus only a small sample of relevant
standards, as examples of a much larger collection:
BS EN 61508 “Functional safety of electrical/electronic/
programmable electronic safety-related systems” is a com-
monly used engineering safety standard which defines
“Safety Integrity Levels” (SIL), and technical safety pro-
cesses such as the use of hazard identification and mitiga-
tion, failsafes, and emergency stop systems.
BS EN 62061—implements principles of BS EN 61508
(above) and is harmonised to parts of the EU Machinery Direc-
tive (i.e., a “called up” standard with legal status).
ISO 10218 “Robots and robotic devices—safety require-
ments for industrial robots.” provides best practices for
industrial robot safety. ISO 15066 “Robots and robotic
devices—collaborative robots” provides best practices for
systems involving robots and humans working together.
ISO 18497 “Safety of autonomous tractors”—is under
development at the time of writing, and aims to provide best
practices for the safety of large autonomous tractors.
3 Law andmitigation ofdamages
A number of the laws examined above appear to be strictly
worded concerning different forms of liabilities whether
these arise under contract, tort or statute. However, despite
the strictness, the law also provides defences and other legal
means through which a stakeholder in an agribot supply
chain may avoid liability or mitigate its damages:
(a) Statutory defences These are defences allowed
under the law. The relevant defences have been
discussed under Product defect above.
(b) Defences to claims for tortious liability Although these
have also been alluded to earlier, it is useful to briefly
highlight how the manufacturer could in practice
defend an action in negligence. As noted above, manu-
facturers may be liable for breaching a duty of care
owed to users of their products, and the law places the
burden of proving the negligent act on the injured party
or the claimant. In effect, the claimant must prove that
the manufacturer did not take reasonable care to avoid
the injury or damage occurring. Conversely, it is a
defence open to the manufacturer that he could not have
reasonably foreseen the harm to the injured party and
could, therefore, not have prevented it. This is based
on the doctrine of the remoteness of damage where
the manufacturer contends that there is no causal link
between the manufacturer’s negligence and the injury
to the claimant. Also, the manufacturer can plead that
the claimant is contributorily negligent. Contributory
negligence is a partial defence which enables the neg-
ligent party (e.g., the manufacturer) to claim mitigation
of damages by proving that the claimant contributed
to his loss or injury. For example, failure to read the
instruction manual or to take specific recommended steps
in circumstances where the agribot malfunctions may
lay a farmer or contractor claimant open to a claim of
contributory negligence.
(c) Exclusion clauses Where the law permits it, parties may by
contract exclude liability for certain acts or omissions.
For example, the agribot manufacturer may exclude lia-
bility for illegal use of a robot or use for purposes other
than that for which the robot was manufactured. Liability
may also be excluded for improper use or interference
with specifications of agribot software or algorithm
(See further notes on dual-use items below).
37 The contractor can, however, bring an action in tort if this is due to
a defect in the product, but only if he also suffers damage. See notes
on product defect above.
(d) Regulation by design The law now actively promotes
regulation by design. Under the GDPR, data control-
lers are required to implement appropriate technical
and organizational measures and procedures in such a
way that data processing will meet the requirements of
the law. It is significant that the law allows data control-
lers to take account of the state of the art in technological
development and the cost of implementation in con-
forming to this requirement (see notes above).
(e) Insurance There is no specific insurance framework
for robotics. However, insurance can be mandated by
law or by contract between the parties. Also, specialist
insurance may be required, and in this regard it has
been proposed that insurers develop new products
and that the law mandate a compulsory insurance scheme
supplemented by a fund (see further notes below).
(f) Judicial approaches The legal regime for compensa-
tion and judicial approaches to the award of damages
could also have a mitigating effect on liability. Courts
are careful not to expand the scope of the existing liability
regime. For example, the provisions of the Compensa-
tion Act and the approach of the courts suggest that
courts may be circumspect in allowing claims for dam-
age caused by agribots, considering their essential
economic function of killing weeds and making more
land available for farming. Under the Compensation
Act 2006 (UK), courts are required to take into account
the fact that allowing specific claims may have adverse
consequences for innovation and investment in desirable
activities.38 In other words, the law considers that if it is
too easy to make successful claims concerning specific
activities, the courts may be overwhelmed with cases for
compensation. This risk, often referred to as ‘opening
the floodgates (of litigation)’, would inhibit investment
in activities which are useful for society or which are of
economic, social or technological significance.
4 Grey areas relating tocurrent legal
concepts
The grey areas refer to aspects of liability in agricultural
robotics where the law is unclear or uncertain. The most
relevant issues considered here include the legal effect of the
agribot’s autonomy on the liability of the parties, liability
for dual-use of the agribot, and the likely effects of EU data
protection law. This section highlights the main arguments
in this area and, where relevant, the proposed solutions to
the challenges.
4.1 The ethics ofrobot autonomy
Under EU law, (non-autonomous) robots can be classified as
products and humans are ultimately responsible for defects,
errors, or misuse of the robot (See notes on liability for prod-
uct defect above). For autonomous robots, the applicable
laws and principles are not so clear. Directive 85/374/EEC
has no direct applicability to liability for damages caused by
autonomous robots, and there is currently no definition of
autonomous robots under EU laws.
Nevertheless, one proposal defines robot autonomy as the
ability of the robot to take decisions and implement them
in the outside world independently of external control or
influence.39 The key features of robot autonomy include the
development of autonomous and cognitive features such as
the ability to learn from experience and take independent
decisions, increasing capacity for adaptability and the exhi-
bition of emergent behaviours. In effect, if an autonomous
robot encounters difficulties that its design did not anticipate,
its actions will not always be a result of programming as its
learning abilities can cause the robot to develop sophisti-
cated interaction with the environment which leads to unpre-
dictability in its behaviour.40
Presumably, therefore, the more autonomous robots
are, the less likely they are to be considered as mere tools
in the hands of other actors such as the manufacturer, owner
and users.41 However, it is not always clear whether, and to
what extent, robots should be autonomous. For example, the
UK House of Commons Science and Technology Committee,
in its report on robotics and artificial intelligence, made the
point that it is important that AI technology is operating as
intended and that unwanted, or unpredictable, behaviours are
not produced, either by accident or maliciously.42 Also, in a
report by the EU, it was suggested that
ciously’.42 Also, in a report by the EU, it was suggested that
it is inconceivable that once another actor no longer controls
a robot, it becomes the actor itself. The report argues further
that a robot being a mere machine and a carcass devoid of
consciousness, feelings and thoughts or its own will can-
not become an autonomous legal actor.43 This observation
arguably undermines the very notion of robot autonomy. For
example, since autonomy is taken to involve self-learning
and the processing of artificial intelligence, then a design
38 See Compensation Act 2006s 1(b).
39 Committee on Legal Affairs, ‘Draft Rules on Robotics’, Recital R.
40 Committee on Legal Affairs Draft Rules on Robotics, Recital Z.
41 Committee on Legal Affairs Draft Rules on Robotics, Recitals Q,
R, S.
42 See House of Commons, Science and Technology Committee,
‘Robotics and Artificial Intelligence’ (12 October 2016) p 16.
43 European Parliament, European Civil Law Rules in Robotics
(Study for the JURI Committee) 2016 p 13 (hereinafter EU Parlia-
ment, Civil Law Rules in Robotics).
that limits that autonomy also limits the use of the robot and
could potentially stifle further innovation in robotics.
As noted in another report by the EU, however, it is
expected that ultimately AI could surpass human intellectual
capacity in a manner which, if not prepared for, could pose a
challenge to humanity’s capacity to control its own creation.44
This position suggests that robot autonomy is a given, and that
technical (but legitimate) questions can be raised concerning
the legal consequences of such autonomy, particularly the
legal responsibility arising from a robot’s harmful action. In
legal responsibility arising from a robot’s harmful action. In
a hypothetical scenario involving the agribot, the following
could occur; the obstacle avoidance on the agribot works
but the robot ‘decides’ it could overcome an obstacle. An
accident occurs, and a walker is injured. All parties deny
liability. The manufacturer argues that the accident occurred
independently as the robot was acting autonomously, the
insurers refuse to indemnify the manufacturer based on the
argument that the operation which caused the accident is
not a ‘defect’, the injured party claims that the accident is
caused by manufacturer defect regardless of robot autonomy.
The question this raises is, therefore, is whether and how a
machine can be held liable for its actions or omissions.45
Although it is not yet clear what values machines should
use, and how to embed these values in them, it has been
suggested that they should function according to values that
are aligned with those of humans and consider following,
as much as possible, ethical theories defined for humans.46
Therefore, guiding legal and ethical frameworks for the
design, production and use of robots and AI must be based
on values such as autonomy, individual responsibility, informed
consent, privacy and social responsibility.47 The proposals
examined below are relevant in this respect.
4.1.1 Proportional liability
To promote certainty, responsibility and accountability, it
has been suggested that a set of rules be developed which
reflects the proportionality of liability depending on the
instructions given to the robot and its capacity for self-
learning as well as its level of autonomy.48 Assuming that
damage, injury or loss could be established, the following
rules of liability would apply:
(a) Manufacturers and producers should be strictly liable
for damage that can be traced back to the robot’s design
such as an error in the algorithm causing injurious
behaviour.49 (See the notes above on how technical design
can aid the law in this area, mainly because of attribution
problems.)
(b) For robots sold with open source software, liability
should in principle be on the person who programmed
the application which led to the robot causing damage.
This is increasingly being incorporated into contracts.
(c) When damage is caused when the robot is still learn-
ing, its user or owner should be held liable. However,
liability should be further governed by whether the user
is a professional user and whether or not they are the
victim. If the damage is caused to a victim who is also
a professional, this would be considered as an accident
at work covered by existing laws governing such acci-
dents. If the damage is linked to robot instruction given
by a professional user which causes damage to a third
party, then the situation calls for the development and
application of new rules.
(d) In cases where the robot is hired out, the hirer should
remain liable. The rationale is that it is difficult, given
that each hirer may teach the robot different things, to
determine which hirer is responsible for the acts of the
robot.
For agribots, manufacturers and agricultural contrac-
tors are likely to fall into this category and would thus
be deemed to be liable in cases where the agribot is
hired out.
(e) Finally, future legislative instruments should provide
for the application of strict liability for damage caused
by smart robots. In effect, only proof of a causal link
between the harmful behaviour of the robot and the
damage suffered by the injured party is required. There
should also be no restrictions on the type and extent of
damages which may be recovered, and there should be
no limit on the forms of compensation which may be
offered to the aggrieved party on the sole ground that
damage was caused by a non-human agent.50
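Rules (a)–(e) above amount, in effect, to a decision procedure for a first-pass allocation of liability. A minimal sketch follows, assuming hypothetical incident categories and an assumed ordering of the rules; none of the names or the precedence below are drawn from the draft rules themselves, and a real framework would of course require legal judgment rather than a lookup:

```python
# Hypothetical sketch: encoding proportional liability rules (a)-(e)
# as a triage function. All field names and the rule ordering are
# illustrative assumptions, not taken from the draft rules.
from dataclasses import dataclass

@dataclass
class Incident:
    design_defect: bool          # damage traceable to design/algorithm error (rule a)
    open_source_app: bool        # damage caused by a third-party application (rule b)
    still_learning: bool         # robot was in its learning phase (rule c)
    professional_user: bool      # user/owner is a professional
    victim_is_professional: bool # injured party is also a professional
    hired_out: bool              # robot was hired out at the time (rule d)

def provisional_liability(i: Incident) -> str:
    """Return a first-pass liability category under rules (a)-(e)."""
    if i.design_defect:
        return "manufacturer (strict liability, rule a)"
    if i.open_source_app:
        return "application programmer (rule b)"
    if i.hired_out:
        return "hirer (rule d)"
    if i.still_learning:
        if i.professional_user and i.victim_is_professional:
            return "workplace accident rules (rule c)"
        if i.professional_user:
            return "new rules needed (rule c)"
        return "user or owner (rule c)"
    # Fallback: rule (e), strict liability on proof of a causal link.
    return "strict liability on proof of causal link (rule e)"
```

Even this toy sketch makes the open question visible: the rules can conflict (a hired-out robot may also still be learning), and the draft rules do not specify which takes precedence.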
4.1.2 Legal personhood
It is possible in theory to confer legal personality on robots.
This allows the autonomous robot to have the status of an
‘electronic person’ for liability and rights.51 “Legal per-
sonhood” is a purely legal concept and is unrelated to the
concept of “personhood” in philosophy, which has been
44 Committee on Legal Affairs Draft Rules on Robotics, Recital I.
45 Committee on Legal Affairs Draft Rules on Robotics, Recitals S.
46 Francesca Rossi, ‘Artificial Intelligence: Potential Benefits and
Ethical Considerations’ (European Parliament Legal Affairs Commit-
tee Briefing) 2016 p 4.
47 Committee on Legal Affairs, ‘Draft Rules on Robotics’, 2016 p 7.
48 EU Parliament, ‘Civil Law Rules in Robotics’, 17.
49 EU Parliament, ‘Civil Law Rules in Robotics’ p 17.
50 See generally Committee on Legal Affairs Draft Rules on Robot-
ics, p 13.
51 This is similar to the concept of corporate legal personality, which
confers separate legal status on companies and corporations, thus sep-
arating the corporation or company from its promoters, managers and
directors.
defined by various authors via difficult philosophical prop-
erties such as “free will” and “consciousness”. Legal person-
hood in robotics would be intended purely as a mechanism
to assign legal liability, in the same way that corporations
are sometimes considered to be legal persons. In particu-
lar, like corporate personhood, it provides a mechanism to
replace liability assigned to individual human operators
with liability assigned to some group of humans such as the
robot design team. This is important and useful, for exam-
ple, if individual human operators do not wish to take on
potential personal liability for deaths caused by the robots,
which could result in prison and other sentences on them as
individuals. Spreading the liability across the design team
via legal personhood would avoid this situation whilst
ensuring that the responsibility still exists in a suitable form.
However, objections have been raised to this proposal
on ethical and conceptual coherence grounds. It has been
argued, for instance, that legal personhood
status for the robot would unavoidably trigger unwanted
and nonsensical legal consequences including the need to
determine what robots’ rights would be and how to respect
those rights. In theory, a robot legal person (or more likely,
a belligerent human claiming to act on its behalf, for exam-
ple to sabotage a robotics company’s product or service, as
human campaign groups currently do against animal test-
ing companies by acting on behalf of the animals) might
then be entitled to demand rights for the robot which were
originally intended only for human legal persons, such as
employment leave, minimum wage, and refusal to work in
dangerous environments.
Although conferring rights on robots could be potentially
nonsensical, the problem only arises if the arguments are
considered from a purely economic perspective. From a
legal perspective, an artificial legal entity does not have to
be conferred with the same rights as humans. In fact, taking
the example of corporations, the law may not confer any
direct rights or duties on the entity but rather on its direct-
ing minds or promoters. Therefore, for robots, electronic
personhood would create the advantage of legal conveni-
ence such as making the robot a distinct legal entity which
can sue and be sued. It would also vest the robot with the
genuinely useful capability to apply for and obtain a work or
operating licence (e.g., an agribot (or rather its designers
on its behalf) can apply for certification to use pesticides,
removing the need for operators to hold the certificate, and
transferring the liability onto the engineering design team).
Electronic personhood can also help the robot (or rather, the
human design team which it represents) fulfil obligations to
self-insure and like corporations, pay compensation to those
injured by its acts or omissions, again reducing the risk to
individual operators.
It is important to note that the robot will have to be reg-
istered in the same way as corporations and may have to be
vested or equipped with assets to enable it to carry out its
duties and obligations. The promoters of the robot will make
the choices about which party(ies) will fund the assets. More
importantly, however, despite the electronic personhood, the
court would be able to lift the veil of incorporation in appro-
priate cases to render the promoters liable for crimes and
civil wrongs committed by the robot.
4.1.3 Registration andinsurance
This is a recommendation for a system of registration for
advanced robots based on criteria established for the
classification of robots.52 A Union-wide agency would manage
the registration which would serve the purpose of trace-
ability for robotics and artificial intelligence.53 Similarly,
the proposal for insurance advocates the establishment of
an insurance scheme which obliges the producer to take
out insurance for the autonomous robot it produces. It is
proposed that a fund supplements the obligatory insurance
scheme to ensure that damages can be compensated for in
cases where no insurance cover exists.54
4.2 Dual‑use products
EU law regulates dual-use products.55 The Regulation sets
up a Community regime for the control of exports, transfer,
brokering and transit of dual-use items and aims to control
trade in dual-use items to counter the proliferation of weapons
of mass destruction and other items of potential military
use.56 Therefore, the Regulation requires that dual-use
items (including software and technology) should be subject
to effective control when they are exported from the Euro-
pean Community.57 Dual-use items are defined as ‘…items,
including software and technology, which can be used for
both civil and military purposes, and shall include all goods
which can be used for both non-explosive uses and assist
in any way in the manufacture of nuclear weapons or other
nuclear explosive devices’.58 Annex 1 to the Regulation
contains a list of dual-use items including nuclear materi-
als (e.g., uranium), telecommunications and information
52 Committee on Legal Affairs Draft Rules on Robotics, p 13.
53 Committee on Legal Affairs Draft Rules on Robotics p 13.
54 Committee on Legal Affairs Draft Rules on Robotics p 13.
55 See COUNCIL REGULATION (EC) No 428/2009 of 5 May 2009
setting up a Community regime for the control of exports, trans-
fer, brokering and transit of dual-use items (hereinafter Regulation
428/2009).
56 See EU Parliament, ‘Implementation Appraisal Control of Trade
in Dual Use Items’ (…Committee briefing 2016).
57 Regulation 428/2009, Recitals 2, 3.
58 Regulation 428/2009, art 2.
security, sensors and lasers, various software, machine tools,
and chemical manufacturing equipment.
The law requires dual-use items to be registered and made
subject to authorisation and export control, including a detailed
register of exports (art 20) as well as review, update and
impact assessment of dual-use items.
It is notable that the law can be extended to products
with potential dual-use that are not listed in Annex 1 to the
Regulations (art 4, 15). In fact, in its draft rules on robot-
ics, the EU legal committee recommends that the provisions
on dual-use regulations should apply to robots.59 Perhaps
because the Regulation intends to ensure that dual-use items
do not get into the hands of malicious actors, it only imposes
liability on manufacturers for non-compliance with relevant
provisions. However, new issues on liability can arise in the
use and operation of the agribot. To illustrate, as agricultural
robots are designed to operate in harsh outdoor conditions,
they may bear functional similarities to, and be repurposable
as, military systems such as explosive ordnance disposal
(EOD), reconnaissance, and weaponised platforms. It is,
therefore, conceivable that in the wrong hands they could be
used to commit crimes including acts of terrorism, such as
delivering lethal substances or weapons into crowded areas.
It is clear, on the one hand, that the malicious actor or
any other person(s) who repurposed the agribot to carry
out criminal or terrorist acts would be deemed to have
committed a crime for which they would be liable to punishment
upon conviction. They could also be liable in a civil action
for damages to the parties thereby injured. On the
other hand, it is not clear whether the manufacturer bears
(or should bear) any liability. As already noted above, the
EU Directive on Product Defect applies only to defective
products; that is, products not providing the safety to which
a consumer is entitled. It is also notable that one of the
factors to be taken into account - in determining whether a
product is defective or not- include whether the product is
being put to reasonable use.60 However, while unreason-
able use can give rise to mitigation of damages, it does not
entirely absolve the manufacturer of liability, and the prob-
lem can become particularly complex if such re-purposing
is foreseeable or can be anticipated by the manufacturer.
Under the EU Machinery Directive, for instance, the con-
tents of the instructions must cover not only the intended
use of the machinery but also take into account any reason-
ably foreseeable misuse.61
The question, therefore, is what uses should be deemed
reasonably foreseeable. For example, is it reasonably
foreseeable that an agribot could be used for criminal or
terrorist purposes? If the answer is yes, then how is the
position different from using a kitchen knife to commit
murder? The knife is sold as a kitchen utensil, not as a weapon,
so although the manufacturer can reasonably foresee that
the knife could be used for heinous crimes, he is not held
responsible for the murderer’s action. Arguably, the position
would be different if the robot were developed purely
for the purpose of committing crimes—such as a modified
agricultural robot with a new implement attachment
designed specifically for breaking and entering domestic
windows and with no other clear function. In that case,
responsibility could lie with the manufacturer if the robot is
repurposed for a further criminal or unlawful purpose. As
described in the discussion of health and safety law in
Sect. 2, the resolution of this discussion is largely depend-
ent on whether firstly, practicable risk reduction measures
(i.e., what can be done about the reasonably foreseeable
hazards) are readily identifiable, and secondly, whether
implementation of those measures is reasonable. In the
case of the knife, a well-established implement for which
many examples of good practice design are available, it is
unlikely that further risk reduction measures are practica-
ble (i.e., technically feasible) that have not already been
tried and their relative virtue exhaustively evidenced. In the
case of the agricultural robot, the industry is still subject
to errors in internal communication, for example, due to
intellectual property protection, lack of established industry
groups and forums, and general lack of publicly available
evidence of safety improvements; therefore, the identifica-
tion of practical risk reduction measures and the reduction
in risk associated with reasonably foreseeable hazards may
not be straightforward for designers.
Furthermore, in what ways should the instructions take
into account reasonably foreseeable misuse? For example,
the fact that instructions expressly prohibit certain re-pro-
gramming or re-purposing would hardly deter a malicious
actor bent on misusing the agribot. These issues would need
to be addressed when developing rules applicable to robotics,
particularly small robots like the agribot.
4.3 Scope andapplication ofdata protection law
It was noted earlier that the new EU Regulations on data
protection make significant provisions that would impact on
developments in robotics and AI. While many of these provisions
address gaps in the law, they also raise difficult questions
about the scope of the law and its impact and applicability
to robotics. As examples, article 25 now makes PbD a
legal standard, and arguably enhances the protection of indi-
vidual privacy. However, given the rapidly evolving tech-
nology environment, and the fact that vulnerabilities and
susceptibilities (to privacy infractions) may only become
59 Committee on Legal Affairs ‘Draft Rules on Robotics’ item 34 p
12.
60 See notes on product defects above.
61 See Directive 2006/42/EC, item 1 of Annex 1.
known subsequent to use or operation of products, as well
as the expansive and ambulatory nature of the concept of
privacy,62 the question must be asked whether it is possible
(even using state of the art) to identify and assess all privacy
implications and dimensions of particular technologies?
Furthermore, under article 22 relating to algorithms that
make decisions based on user-level predictors which sig-
nificantly affect users, the law effectively creates a ‘right
to explanation’. This entitles users to ask for an explana-
tion of an algorithmic decision that was made about them
(See previous notes above). Although decisions based on
algorithms raise difficult ethical and privacy questions, the
provision also poses significant challenges for the AI and
machine learning community. For example, it is a common
misconception that complex algorithms always do what
their designers choose to have them do; in fact, it is
difficult to understand, predict, and explain the behaviour
of advanced AI systems because of the complexity of the
systems and the large volume of data they use.63 Also, from
a technical perspective, a requirement that algorithms offer
explanations for their underlying decisions could potentially
prohibit the algorithms currently in use. This means that, to
comply with the law, a complete overhaul of standard and widely
used algorithmic techniques may be required.64
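To illustrate the technical gap this provision exposes, consider a minimal sketch (ours, not drawn from the GDPR or from the article): for a simple linear scoring model, a per-feature "explanation" of a decision can be read off directly from the weights, whereas a deep network or ensemble admits no such direct decomposition. All weights and feature names below are hypothetical:

```python
# Hypothetical toy model: per-feature contributions are trivially
# available for a linear score, which is what makes a 'right to
# explanation' easy here and hard for opaque models.
weights = {"soil_moisture": 1.5, "weed_density": -2.0, "slope": -0.5}
bias = 0.2

def score(features: dict) -> float:
    """Linear decision score: bias plus weighted sum of features."""
    return bias + sum(weights[k] * v for k, v in features.items())

def explain(features: dict) -> dict:
    """Per-feature contribution to the score (linear models only)."""
    return {k: weights[k] * v for k, v in features.items()}

f = {"soil_moisture": 0.8, "weed_density": 0.3, "slope": 0.1}
contributions = explain(f)
# The contributions sum (plus bias) exactly to the score, so each
# feature's influence on the decision can be reported to the user.
assert abs(score(f) - (bias + sum(contributions.values()))) < 1e-9
```

For non-linear models no such exact decomposition exists, and the explainability techniques that approximate one remain active research, which is precisely the compliance difficulty the text describes.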
Finally, while the GDPR applies directly to all EU mem-
ber states, it is unclear, given the uncertain political terrain
precipitated by Brexit, how and the extent to which the law
would apply to the UK. For example, even if the UK adopts
the GDPR (which will take effect before the UK exits the
EU), will the UK be bound to continue to implement the
GDPR and its subsequent amendments? What would be the
effect of the opinions, studies etc. conducted by the EU on
the formulation of policies on robotics, machine learning, AI
and cognitive computing in the future? More importantly,
since the GDPR now establishes both a European Data Pro-
tection Board (EDPB) and national supervisory authorities,
what are the effects of the multiple (or at least dual) administrative
and compliance regimes that Brexit could potentially
create for the AI and robotics community in the EU?
5 Conclusion
The liability regime which applies to the use and opera-
tion of the agribot appears to be complicated. However, an
essential aspect of this regime is that parties have different
rights and obligations under different laws, which makes it
possible to distribute liabilities. The law also allows defences
which are particularly specific and relevant for promoting
developments in technology. More crucially, where permit-
ted by law, parties may re-allocate liabilities and claim con-
tributions for damages arising from accidents involving the
agribot.
The outstanding issue requiring consideration is how
autonomy should be defined in the context of the operation
of the agribot. Unless law, policy or (for present purposes)
contracts define the scope of the autonomy of the robot,
the liability regime may be challenged by technical legal
arguments.
Acknowledgements The authors would like to thank Ed Thomas at
Risktec Solutions Ltd and the anonymous reviewers for their useful
comments. This research was supported in part by the InnovateUK
project IBEX2: “Autonomous robot weed spraying for less favoured
areas”, grant number 131790.
Open Access This article is distributed under the terms of the Crea-
tive Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/),
which permits unrestricted use, distribution,
and reproduction in any medium, provided you give appropriate
credit to the original author(s) and the source, provide a link to the
Creative Commons license, and indicate if changes were made.
References
Anderson JM, Nidhi K, Stanley KD, Sorensen P, Samaras C, Oluwatola
OA (2014) Autonomous vehicle technology: a guide for policy-
makers. Rand Corporation
Andrieu C, Freitas ND, Doucet A, Jordan MI (2003) An introduction to
MCMC for machine learning. Mach Learn 50(1–2):5–43
Bac CW, Henten EJ, Hemming J, Edan Y (2014) Harvesting robots for
high-value crops: state-of-the-art review and challenges ahead. J
Field Robot 31(6):888–911
Beck S (2016) The problem of ascribing legal responsibility in the case
of robotics. AI Soc 31(4):473–481
Beiker SA (2012) Legal aspects of autonomous driving. Santa Clara
L Rev 52:1145
Bernardo JM, Smith AFM (2001) Bayesian theory
Binch A, Fox CW (2017) Controlled comparison of machine vision
algorithms for Rumex and Urtica detection in grassland. Comput
Electron Agric 140:123–138
Bishop CM (2006) Pattern recognition and machine learning. Springer,
New York. ISBN 978-0-387-31073-2. https://www.springer.com/us/book/9780387310732
Blackmore S, Godwin RJ, Fountas S (2003) The analysis of spatial
and temporal trends in yield map data over six years. Biosyst Eng
84(4):455–466
Blackmore S, Griepentrog HW, Nielsen H, Nørremark M, Resting-
Jeppesen J (2004) Development of a deterministic autonomous
tractor. In Proceedings CIGR, vol. 11, p 2004
62 For example, privacy can be relative to context, societies and even
technologies.
63 Executive Office of the President, National Science and Technol-
ogy Council Committee on Technology, ‘Preparing for the Future
of Artificial Intelligence’ (2016) p 31; See also Arnold v Reuther 92
So. 2d 595, 596 where the court appeared to sanction this notion of
autonomy when it suggested that liability in cases of autonomous
vehicles will be higher because they (the vehicles) raise the presump-
tion that their programming is accurate.
64 Bryce Goodman and Seth Flaxman, ‘European Union Regulations
on Algorithmic Decision-making and a Right to Explanation’ p 1.
Brodsky JS (2016) Cyberlaw and venture law: Autonomous vehicle
regulation: how an uncertain legal landscape may hit the brakes
on self-driving cars. Berkeley Tech LJ 31:851–1169
Chua PY, Ilschner T, Caldwell DG (2003) Robotic manipulation of
food products—a review. Ind Robot Int J 30(4):345–354
Cooper GF (1990) The computational complexity of probabilistic infer-
ence using Bayesian belief networks. Artif Intell 42(2–3):393–405
Douma F, Palodichuk SA (2012) Criminal liability issues created by
autonomous vehicles. Santa Clara L Rev, 52:1157
Webster M (2017) HSE improvement and prohibition notices:
what do they tell us about CDM 2015 and construction health and
safety? v1.0, November
Dvorak J (2016) An autonomous, solar-powered tractor. American
Society of Agricultural and Biological Engineers
Escolà A, Rosell-Polo JR, Planas S, Gil E, Pomar J, Camp F, Llorens
J, Solanelles F (2013) Variable rate sprayer. Part 1—orchard pro-
totype: design, implementation and validation. Comput Electron
Agric 95:122–135
Fox CW, Roberts SJ (2012) A tutorial on variational bayesian infer-
ence. Artif Intell Rev 38(2):85–95
Guizzo E (2011) How Google’s self-driving car works. IEEE Spectrum
Online, October 18, 2011
Health and Safety Executive (2001) (R2P2) Reducing risks, protecting
people – HSE’s decision-making process
Ishida M, Imou K, Okado A, Takenaga H, Honda Y, Itokawa N,
Shibuya Y (1998) Autonomous tractor for forage production. J
Jpn Soc Agric Mach 60(2):59–66
Levitt TS, Laskey KB (2000) Computational inference for evidential
reasoning in support of judicial proof. Cardozo L Rev 22:1691
Marino D, Tamburrini G (2006) Learning robots and human responsi-
bility. Int Rev Inf Ethics 6(12):46–51
Michio K, Noguchi N, Ishii K, Terato H (2002) The development of
the autonomous tractor with steering controller applied by opti-
mal control. In: Automation technology for off-road equipment
proceedings of the 2002 conference, p 367. American Society of
Agricultural and Biological Engineers
Pedersen SM, Fountas S, Have H, Blackmore BS (2006) Agricultural
robots—system analysis and economic feasibility. Precis Agric
7(4):295–308
Pinto C (2012) How autonomous vehicle policy in California and
Nevada addresses technological and non-technological liabilities.
Intersect Stanf J Sci Technol Soc 5
Singh N, Shaligram AD (2014) NPK measurement in soil and auto-
matic soil fertilizer dispensing robot. Int J Eng Res Technol
3:635–637 (ESRSA Publications)
Thrun S, Burgard W, Fox D (2005) Probabilistic robotics. MIT press,
Cambridge
Transparency Market Research (2017) Agriculture robots market—
global industry analysis, size, share, growth, trends and forecast
2016–2024. Transparency Market Research
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
1.
2.
3.
4.
5.
6.
Terms and Conditions
Springer Nature journal content, brought to you courtesy of Springer Nature Customer Service Center GmbH (“Springer Nature”).
Springer Nature supports a reasonable amount of sharing of research papers by authors, subscribers and authorised users (“Users”), for small-
scale personal, non-commercial use provided that all copyright, trade and service marks and other proprietary notices are maintained. By
accessing, sharing, receiving or otherwise using the Springer Nature journal content you agree to these terms of use (“Terms”). For these
purposes, Springer Nature considers academic use (by researchers and students) to be non-commercial.
These Terms are supplementary and will apply in addition to any applicable website terms and conditions, a relevant site licence or a personal
subscription. These Terms will prevail over any conflict or ambiguity with regards to the relevant terms, a site licence or a personal subscription
(to the extent of the conflict or ambiguity only). For Creative Commons-licensed articles, the terms of the Creative Commons license used will
apply.
We collect and use personal data to provide access to the Springer Nature journal content. We may also use these personal data internally within
ResearchGate and Springer Nature and as agreed share it, in an anonymised way, for purposes of tracking, analysis and reporting. We will not
otherwise disclose your personal data outside the ResearchGate or the Springer Nature group of companies unless we have your permission as
detailed in the Privacy Policy.
While Users may use the Springer Nature journal content for small scale, personal non-commercial use, it is important to note that Users may
not:
use such content for the purpose of providing other users with access on a regular or large scale basis or as a means to circumvent access
control;
use such content where to do so would be considered a criminal or statutory offence in any jurisdiction, or gives rise to civil liability, or is
otherwise unlawful;
falsely or misleadingly imply or suggest endorsement, approval , sponsorship, or association unless explicitly agreed to by Springer Nature in
writing;
use bots or other automated methods to access the content or redirect messages
override any security feature or exclusionary protocol; or
share the content in order to create substitute for Springer Nature products or services or a systematic database of Springer Nature journal
content.
In line with the restriction against commercial use, Springer Nature does not permit the creation of a product or service that creates revenue,
royalties, rent or income from our content or its inclusion as part of a paid for service or for other commercial gain. Springer Nature journal
content cannot be used for inter-library loans and librarians may not upload Springer Nature journal content on a large scale into their, or any
other, institutional repository.
These terms of use are reviewed regularly and may be amended at any time. Springer Nature is not obligated to publish any information or
content on this website and may remove it or features or functionality at our sole discretion, at any time with or without notice. Springer Nature
may revoke this licence to you at any time and remove access to any copies of the Springer Nature journal content which have been saved.
To the fullest extent permitted by law, Springer Nature makes no warranties, representations or guarantees to Users, either express or implied
with respect to the Springer nature journal content and all parties disclaim and waive any implied warranties or warranties imposed by law,
including merchantability or fitness for any particular purpose.
Please note that these rights do not automatically extend to content, data or other material published by Springer Nature that may be licensed
from third parties.
If you would like to use or distribute our Springer Nature journal content to a wider audience or on a regular basis or in any other manner not
expressly permitted by these Terms, please contact Springer Nature at
onlineservice@springernature.com
... However, it is essential to highlight that it is a key aspect when collecting data from farmers because it is necessary to guarantee transparency and generate trust among the other parties involved to achieve adoption processes [93]. Research conducted by [11] raises the need to define legal frameworks for the adoption of precision agriculture technologies, especially robots, due to legal aspects such as the operation and use of these devices, as well as privacy and the use of information collected by these devices, showing that there are still some gray areas in the current global legislature that could promote and facilitate the adoption and investment of this type of technology [11]. ...
... However, it is essential to highlight that it is a key aspect when collecting data from farmers because it is necessary to guarantee transparency and generate trust among the other parties involved to achieve adoption processes [93]. Research conducted by [11] raises the need to define legal frameworks for the adoption of precision agriculture technologies, especially robots, due to legal aspects such as the operation and use of these devices, as well as privacy and the use of information collected by these devices, showing that there are still some gray areas in the current global legislature that could promote and facilitate the adoption and investment of this type of technology [11]. ...
Article
Full-text available
Access to food products is becoming more and more complex due to population growth, climate change, political and economic instability, disruptions in the global value chain, as well as changes in consumption dynamics and food insecurity. Therefore, agri-food chains face increasingly greater challenges in responding to these dynamics, where the digitalization of agri-food systems has become an innovative alternative. However, efforts to adopt and use the technologies of the fourth industrial revolution (precision agriculture, smart agriculture, the Industrial Internet of Things, and the Internet of Food, among others) are still a challenge to improve efficiency in the links of production (cultivation), processing (food production), and final consumption, from the perspective of the implementation of Food Informatics technologies that improve traceability, authenticity, consumer confidence, and reduce fraud. This systematic literature review proposes the identification of barriers and enablers for the implementation of Food Informatics technologies in the links of the agri-food chain. The PRISMA methodology was implemented for the identification, screening, eligibility, and inclusion of articles from the Scopus and Clarivate databases. A total of 206 records were included in the in-depth analysis, through which a total of 34 barriers to the adoption of Food Informatics technologies (13 for the production link, 12 for the processing link, and 9 for the marketing link) and a total of 27 enablers (8 for the production link, 11 for the processing link, and 8 for the marketing link) were identified. Among the barriers analogous to the three links analyzed are privacy and information security and high investment and maintenance costs, while the analogous enablers are mainly government support.
... While autonomous vehicles, including laser weeders, have gained acceptance in certain experimental locations for on-road use, they pose unique challenges when operating in agricultural environments. Small autonomous agricultural robots, such as laser weeders, are subject to safety guidelines to ensure their responsible use on both private and public properties [84,85]. ...
Article
Full-text available
Weed infestations pose significant challenges to global crop production, demanding effective and sustainable weed control methods. Traditional approaches, such as chemical herbicides, mechanical tillage, and plastic mulches, are associated with environmental concerns and also face challenges such as herbicide resistance and losses of soil health, moisture content, and organic matter, as well as erosion. Thermal methods such as flaming, steaming, and hot foam application are emerging weed control technologies, along with the directed-energy systems of electrical and laser weeding. This paper conducts a comprehensive review of laser weeding technology, comparing it with conventional methods and highlighting its potential environmental benefits. Laser weeding, known for its precision and targeted energy delivery, emerges as a promising alternative to conventional control methods. This review explores various laser weeding platforms, discussing their features, applications, and limitations, with a focus on critical areas for improvement, including dwell time reduction, automated navigation, energy efficiency, affordability, and safety standards. Comparative analyses underscore the advantages of laser weeding, such as reduced environmental impact, minimized soil disturbance, and the potential for sustainable agriculture. The paper concludes by outlining key areas for future research and development to enhance the effectiveness, accessibility, and affordability of laser weeding technology. In summary, laser weeding presents a transformative solution for weed control, aligning with the principles of sustainable and environmentally conscious agriculture and addressing the limitations of traditional methods.
... These regulations should offer guidance on the operation, safety requirements, and oversight of autonomous field robots. Flexibility is crucial to accommodate technological advancements while ensuring safety and compliance standards are maintained (Basu et al. 2020). Secondly, adopting a risk-based regulatory approach is recommended. ...
Technical Report
Full-text available
Introduction: In recent decades, advancements in technology have started to transform the agriculture industry. Field robots offer a promising solution to enhance efficiency, productivity, and sustainability. However, ambiguous, restrictive legislative boundaries in Czechia, such as confined operating areas or constant operator supervision, hinder the widespread adoption of autonomous agricultural technologies. This policy paper advocates for updating legislative frameworks to unlock the full potential of autonomous technologies in Czech agriculture, fostering a more efficient, sustainable, and prosperous future for farming communities. Why support automation: The current legislative framework governing the use of autonomous field robots in Czech agriculture presents challenges that hinder their widespread adoption and integration. Ambiguity and inconsistency in regulations have impeded innovation in the sector. However, there is a clear path forward. The policy recommendations advocate for clear regulations, risk-based approaches, R&D support, stakeholder collaboration, and adoption incentives to foster a supportive environment for autonomous agriculture. By implementing these recommendations, Czechia can unlock the transformative potential of autonomous technologies, driving innovation, improving competitiveness, and promoting sustainability in the agricultural sector. Collaboration between government, industry, and academia will be key in achieving this vision and positioning Czechia as a leader in agricultural automation.
... alternative pest control systems such as mechanical weed control through camera-controlled systems) (Finger, 2023). However, the adoption of technologies like autonomous robots is still low and legal regulations pose challenges for their application (Basu et al., 2020; Oliveira et al., 2021). Consequently, there is a need for unbiased assessments to determine the benefits and drawbacks of these technologies in real-life settings among stakeholders (Tamirat et al., 2023). ...
Article
Full-text available
CONTEXT: Intensive food and feed production in sole-cropped, large fields with high fertilizer and pesticide inputs to achieve high yields, has contributed to detrimental environmental impacts. To move towards more sustainable agricultural landscapes, cropping system diversification has been suggested as a promising practice for which the use of digital technologies could be potentially beneficial. Understanding the impact of diversified, newly arranged cropping systems and their management in a landscape context requires long-term experimental data at the landscape scale and practical experiences in using digital technologies which are hardly available. Experimental platforms in an agricultural landscape setup with farmers’ involvement could meet such demands but have not been set up in many regions nor has the process of designing such platforms been described systematically. OBJECTIVE: The overall objective of this study was to describe how an experimental platform can be co-designed jointly by researchers and practitioners to study and understand the impact of diversification practices compared to current cropping systems in Eastern Brandenburg, Germany. Specifically, we aimed to re-design an intensively managed field into smaller field segments that we called patches and to assess the potential of a co-created landscape experiment for sustainable agricultural production focussing on both, the practitioners´ and scientists´ perspective. METHODS: We used the DEED research cycle (Describe, Explain, Explore and Design) as a conceptual framework to co-design the landscape experiment called patchCROP within a commercial farm. Patches were implemented as 0.5 ha fields within the original field based on yield and soil maps using advanced cluster analysis which considered soil heterogeneity and topography. The original narrow crop sequence was diversified by integrating new crops, cover crops and flower strips for a five-year crop rotation. 
To cultivate the patches, large machinery was used during the first years, but it will be replaced over time with autonomous field robots. Workshops and various methods such as a SWOT analysis were used to adjust the management practices towards pesticide reduction. RESULTS AND CONCLUSIONS: The SWOT analysis revealed opportunities and drawbacks of developing such a research platform in a participative manner, from both the scientific and the practical farming perspective. We found that the farmer-centric position focused mainly on the economic return and the feasibility of future field operations in a spatio-temporally diversified field. The scientific perspective, on the other hand, described needs and potentials of the research process for evaluating dynamic, interdependent or opposing natural processes and their interactions, such as productivity, biodiversity and ecosystem service changes, in an agricultural landscape context. SIGNIFICANCE: Co-designed landscape experiments have the potential to simultaneously assess the impact of newly developed cropping systems on biodiversity and ecosystem services beyond the field level, crop performance and soil quality at multiple scales, and the implications for multiple actors. This is a step forward in extending systems-based research from single plots to landscape research in an on-farm environment, allowing the exploration of diversification measures with new digital technologies in the long run.
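The patch delineation step the abstract describes (clustering yield and soil maps into homogeneous zones) can be sketched roughly as follows. This is a minimal sketch under stated assumptions, not the study's actual method: the grid size, feature values, and use of plain k-means are all invented for illustration.

```python
# Hedged sketch: delineating homogeneous field patches by clustering
# per-cell (yield, soil) map values, loosely in the spirit of the cluster
# analysis the abstract mentions. All values here are illustrative.
import numpy as np

def kmeans(points, k, iters=20):
    """Plain k-means with deterministic init; returns one label per point."""
    # Spread initial centres across the point list (deterministic).
    centres = points[np.linspace(0, len(points) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Assign each cell to its nearest centre, then move centres.
        labels = np.argmin(((points[:, None] - centres) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = points[labels == j].mean(axis=0)
    return labels

# Toy field: 400 cells of (yield t/ha, soil score) with two distinct zones.
rng = np.random.default_rng(1)
zone_a = rng.normal([6.0, 40.0], 0.5, (200, 2))
zone_b = rng.normal([9.0, 70.0], 0.5, (200, 2))
cells = np.vstack([zone_a, zone_b])
labels = kmeans(cells, k=2)
print(np.bincount(labels))  # two patches of 200 cells each
```

In practice the cluster labels would be mapped back onto the field grid to draw patch boundaries; the study additionally accounted for soil heterogeneity and topography.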
Chapter
In the context of smart agriculture, new and advanced machines such as mobile field robots are emerging for use in agriculture. These systems are usually lightweight machines compared to conventional tractors and can be categorized according to their structural components, functionalities, and applications. Both commercial systems and academic prototypes have been developed and advanced in recent years. The reasons behind the low adoption of commercial mobile robots have been investigated, as well as the challenges that need to be addressed and the factors influencing the adoption of agricultural robots. However, less attention has been paid to the specific capabilities that a mobile field robot should have from the farmer's point of view, and to the agricultural activities for which it could be used profitably. Therefore, the objective of this study is to define farmers' needs and requirements for mobile agricultural field robots. A survey method was used to obtain responses from farmers from different areas and specializations. The questionnaire, composed of three sections, was deployed to Italian farmers both online and as a printed paper version. The first section collected general information about the farmer, farm type, and equipment currently used. The second focused on the farmers' knowledge of mobile field robots, ending with an illustrative video. The third collected farmers' needs and feedback regarding robots. A total of 51 farmers responded to the survey, indicating good knowledge of mobile field robots. Considering farmers' needs, variable rate technology and tillage operations were identified as the practices that can be most effectively implemented by field robots. The key features indicated were a fast recharge time and an autonomy of 7-9 hours for electric robots, as well as a minimum forward speed of 2-5 km/h.
The primary reasons for investing in agricultural robots were to reduce labor costs, increase farm profitability, and prioritize worker safety. In contrast, the availability of incentives and environmental sustainability appear to be less important factors. The main concerns were the need for specialized knowledge, ensuring safe operating practices, and the investment costs associated with the technology.
Article
Full-text available
This review focused on the inventory of current digital technologies available on the agricultural market in Germany. A total of 189 digital technologies were found as of December 2023. Digital technologies in agriculture rarely consist of only a few components; they are built from various other technologies with many common interfaces. Therefore, a two-level classification was used: technologies were categorized according to their type (software-based and hardware-based technologies) and mode of operation (farm management information systems/decision support systems, digital technologies for guidance and steering, digital information platforms, citizen science applications and platforms, sensors, field robots and unmanned aerial vehicles). Furthermore, the expected potentials of these digital tools for the promotion of nature conservation and ecosystem service provisioning in Germany were framed. The review also discusses barriers that can impact nature conservation and ecosystem service provisioning. Germany, as one of the world’s leading nations in the production and use of modern technologies, had set ambitious goals regarding digitalisation as a solution to nature conservation and ecosystem service provisioning problems, which have not yet been fulfilled. The potentials for nature conservation and ecosystem service provisioning are still strongly suppressed by non-sustainable barriers, e.g., high acquisition costs, practical maturity, mode of operation and infrastructure. Current policies and societal preferences are not yet contributing enough to steer the use of digital technologies in the direction of nature conservation and the provision of ecosystem services. Furthermore, the main participants in the digitalisation discussion are researchers, while the smallest group of participants is farmers.
For a sustainable digital transformation of agriculture, including targets for the restoration and protection of nature and ecosystems, more wide-ranging and diversified changes supported by digitalisation are needed, aligned with agricultural and ecological concepts, to achieve the long-term resilience of agricultural systems.
Chapter
Green chemistry is a scientific field that strives to work at the molecular level to achieve sustainability. Central to green chemistry is a set of twelve principles based on the minimization of toxic solvents in chemical processes. This chapter examines the relation between green chemistry and achieving the sustainable development goals set by the United Nations, the value of green chemistry and sustainable development, and an economic model based on green chemistry principles. The future of green chemistry and sustainable development, as well as implementation strategies, are also suggested. The green chemistry principles are: prevention, atom economy, less hazardous chemical syntheses, designing safer chemicals, safer solvents and auxiliaries, use of renewable feedstocks, design for energy efficiency, reduction of derivatives, catalysis, design for degradation, real-time analysis for pollution prevention, and inherently safer chemistry for accident prevention. Based on these principles, several potential green businesses, such as green transportation, green building, green energy, green chemicals, green agriculture, and green tourism, are addressed in this chapter; more detailed discussions are presented in Chaps. 13–17 of this book. In summary, green chemistry is the molecular science of sustainability, designing chemical products and processes that reduce the use or generation of hazardous substances while ensuring their performance, cost and safety. It should be noted that green chemistry and green engineering are only scientific solutions for cleaner production; they inevitably create strong interactions with end-users and markets. Integrating green chemistry and green engineering with prevention-assurance sustainability can simultaneously accelerate economic growth and reduce environmental and social impacts.
Article
Full-text available
Research on land technology should be guided by concerns for environmental sustainability. There must be a robust framework that regulates land use and development, taking into account changes in the environment due to biological and human-made substances and other factors. The research has shown how important it is to have all stakeholders involved in the regulation process through the unique Integrated Stakeholder Engagement Approach (ISEA). The proposed ISEA can bring together different stakeholders, including government agencies, corporate experts, environmental advocates, and community groups, to develop appropriate regulatory frameworks. One model for reducing the impacts of a building is to build it on a strong foundation. It is necessary today for such studies to include simulation assessment in order to evaluate the effectiveness of the regulatory system. This review examines the possible outcomes and environmental implications associated with specific regulations; simulations based on certain zoning policies are useful for decision-making and policy choices. By minimizing pollution while using simulation analysis techniques among different stakeholders, this initiative aims to facilitate resilient, sustainable land improvement.
Article
Full-text available
The recent development of robotics poses new challenges for the legislature as well as for jurisprudence. Especially, the ascription of responsibility to a specific individual becomes more difficult when confronted with an autonomous, learning and decision-making robot. A discussion about how to solve the problems of the wronged party having to prove the cause of the damage and the person responsible for it has to take place. One possible solution could be to ascribe a specific legal status to autonomous machines, similar to the status of legal persons (corporations). Discussing responsibility in this context should also include the question of the consequences of us intentionally handing over decision-making onto machines. This transfer as well as the development of robotics as such will have repercussions on the normative concepts our society is based upon. The space for these changes has to be created consciously.
Article
Automated robotic weeding of grassland will improve the productivity of dairy and sheep farms while helping to conserve their environments. Previous studies have reported results of machine vision methods to separate grass from grassland weeds, but each uses its own dataset and reports only the performance of its own algorithm, making it impossible to compare them. A definitive, large-scale independent study is presented of all major known grassland weed detection methods, evaluated on a new standardised data set under a wider range of environmental conditions. This allows a fair, unbiased, independent and statistically significant comparison of these and future methods for the first time. We test features including local binary patterns, BRISK, Fourier and Watershed, and classifiers including support vector machines, linear discriminants, nearest neighbour, and meta-classifier combinations. The most accurate method is found to use local binary patterns together with a support vector machine.
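As a rough illustration of the winning pipeline (local binary pattern texture features classified by a support vector machine), a toy sketch might look like the following. The synthetic textures, patch size, and basic 8-neighbour LBP variant are assumptions for illustration, not the paper's data or exact feature set.

```python
# Hedged sketch: LBP texture histograms + SVM, the pipeline the abstract
# reports as most accurate. Synthetic "grass" (smooth gradient) and "weed"
# (noisy) patches stand in for real labelled grassland imagery.
import numpy as np
from sklearn.svm import SVC

def lbp_histogram(img):
    """256-bin histogram of basic 8-neighbour LBP codes for one grey patch."""
    h, w = img.shape
    c = img[1:-1, 1:-1]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(shifts):
        # Compare each interior pixel to one neighbour; pack result as a bit.
        neigh = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neigh >= c).astype(np.uint8) << bit
    hist, _ = np.histogram(codes, bins=256, range=(0, 256), density=True)
    return hist

# Synthetic stand-ins: smooth gradient "grass" vs. high-variance "weed".
rng = np.random.default_rng(0)
grass = [np.tile(np.linspace(0.0, 50.0, 32), (32, 1))
         + rng.normal(0, 0.5, (32, 32)) for _ in range(20)]
weeds = [rng.normal(25.0, 15.0, (32, 32)) for _ in range(20)]
X = np.array([lbp_histogram(p) for p in grass + weeds])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="linear").fit(X, y)
acc = clf.score(X, y)  # training accuracy on the toy textures
print(acc)
```

A real evaluation would of course use held-out test images and the standardised dataset the paper describes; this only shows the shape of the feature-then-classifier pipeline.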
Article
We summarize the potential impact that the European Union's new General Data Protection Regulation will have on the routine use of machine learning algorithms. Slated to take effect as law across the EU in 2018, it will restrict automated individual decision-making (that is, algorithms that make decisions based on user-level predictors) which "significantly affect" users. The law will also create a "right to explanation," whereby a user can ask for an explanation of an algorithmic decision that was made about them. We argue that while this law will pose large challenges for industry, it highlights opportunities for machine learning researchers to take the lead in designing algorithms and evaluation frameworks which avoid discrimination.
Book
For the past hundred years, innovation within the automotive sector has created safer, cleaner, and more affordable vehicles, but progress has been incremental. The industry now appears close to substantial change, engendered by autonomous, or "self-driving," vehicle technologies. This technology offers the possibility of significant benefits to social welfare — saving lives; reducing crashes, congestion, fuel consumption, and pollution; increasing mobility for the disabled; and ultimately improving land use. This report is intended as a guide for state and federal policymakers on the many issues that this technology raises. After surveying the advantages and disadvantages of the technology, RAND researchers determined that the benefits of the technology likely outweigh the disadvantages. However, many of the benefits will accrue to parties other than the technology's purchasers. These positive externalities may justify some form of subsidy. The report also explores policy issues, communications, regulation and standards, and liability issues raised by the technology; and concludes with some tentative guidance for policymakers, guided largely by the principle that the technology should be allowed and perhaps encouraged when it is superior to an average human driver.
Article
An autonomous tractor was developed for the purpose of saving labor in forage production. Precise recognition of the position and direction of the vehicle is required for autonomous driving. In this system, two internal sensors (a fiber optic gyroscope and an ultrasonic Doppler velocimeter) were mounted on the tractor to detect the yaw angular velocity and the running speed. The tractor position was calculated by integrating those outputs. The steering was operated by a personal computer under a combination of on-off and proportional control methods. The results of 50 m straight running tests on a flat field at four speeds from 0.7 m/s to 1.8 m/s showed that the maximum lateral displacement from the reference line was less than 10 cm. In the case of a 400 m run at 1.2 m/s for straight running and 0.8 m/s for turning, which was carried out to demonstrate the practical application, the maximum displacement from the predetermined path was about 1.5 m.
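The position estimation described, integrating gyro yaw rate and Doppler ground speed over time, is classic planar dead reckoning. A minimal sketch, with the sensor values and time step assumed purely for illustration, could be:

```python
# Hedged sketch of planar dead reckoning: Euler-integrating yaw rate
# (gyro) and ground speed (Doppler) into a pose estimate, as in the
# abstract's scheme. Sensor values and time step are assumptions.
import math

def integrate_pose(x, y, heading, speed, yaw_rate, dt):
    """One Euler-integration step of planar odometry."""
    heading += yaw_rate * dt          # rad, from the gyroscope
    x += speed * math.cos(heading) * dt  # m, speed from the velocimeter
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Straight run at 1.2 m/s for 10 s with zero yaw rate, 10 Hz updates.
x = y = heading = 0.0
for _ in range(100):
    x, y, heading = integrate_pose(x, y, heading, 1.2, 0.0, 0.1)
print(round(x, 2), round(y, 2))  # 12.0 0.0
```

Because pure integration accumulates sensor drift without bound, the roughly 1.5 m error over the 400 m demonstration run reported above is the expected behaviour of such a system without external position fixes.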