Toward Precise Robotic Weed Flaming Using a Mobile Manipulator with a
Flamethrower
Di Wang, Chengsong Hu, Shuangyu Xie, Joe Johnson, Hojun Ji, Yingtao Jiang,
Muthukumar Bagavathiannan, and Dezhen Song
Abstract— Robotic weed flaming is a new and environmen-
tally friendly approach to weed removal in the agricultural field.
Using a mobile manipulator equipped with a flamethrower, we
design a new system and algorithm to enable effective weed
flaming, which requires robotic manipulation with a soft and
deformable end effector, as the thermal coverage of the flame
is affected by dynamic or unknown environmental factors such
as gravity, wind, atmospheric pressure, fuel tank pressure, and
pose of the nozzle. System development includes overall design,
hardware integration, and software pipeline. To enable precise
weed removal, the greatest challenge is to detect and predict
dynamic flame coverage in real time before motion planning,
which is quite different from a conventional rigid gripper in
grasping or a spray gun in painting. Based on the images from
two onboard infrared cameras and the pose information of the
flamethrower nozzle on a mobile manipulator, we propose a new
dynamic flame coverage model. The flame model uses a center-
arc curve with a Gaussian cross-section model to describe the
flame coverage in real time. The experiments have demonstrated
the working system and shown that our model and algorithm
can achieve a mean average precision (mAP) of more than 76%
in the reprojected images during online prediction.
I. INTRODUCTION
Weed infestation is an important and perpetual problem in
agriculture. Weeds compete with crops for nutrients, water,
and sunlight. Manual removal of weeds is labor-intensive,
and large-scale mechanized herbicide spray has significant
adverse impacts on the environment. Robotic removal of
weeds provides a new approach to this old problem. Here,
we are interested in developing environmentally friendly
solutions to control weeds in early growth stages. More
precisely, we report our progress in developing a robotic
weed-flaming solution using a mobile manipulator equipped
with a flamethrower (see Fig. 1).
To enable precise robotic weed flaming, new system and
algorithm developments are necessary. System development
includes overall design, hardware integration, and software
pipeline. On the algorithm side, to enable precise weed re-
moval, the greatest challenge is to detect and predict dynamic
flame coverage in real time before motion planning. This is
quite different from a conventional rigid gripper in grasping
or a spray gun in painting because the thermal coverage of
the flame is affected by dynamic or unknown environmental
D. Wang, S. Xie, Y. Jiang, and D. Song are with Department of Computer
Science and Engineering, Texas A&M University. D. Song is also with
Department of Robotics, Mohamed Bin Zayed University of Artificial
Intelligence (MBZUAI) in Abu Dhabi, UAE. C. Hu, J. Johnson, and M.
Bagavathiannan are with Department of Soil and Crop Sciences, Texas
A&M University. H. Ji is with Boston Dynamics. Corresponding author:
Dezhen Song. Email: dezhen.song@mbzuai.ac.ae.
Fig. 1. Robotic weed flaming system design and main components. (Labeled components: propane tank, lighter and torch, thermal cameras 1 and 2, RGB-D camera, Grayscale-D camera, on-board computer, and cross-wind direction.)
factors such as gravity, wind, atmospheric pressure, fuel tank
pressure, and pose of the nozzle. Using the images from two
onboard infrared cameras and the pose information of the
flamethrower nozzle on a mobile manipulator, we propose a
new dynamic flame coverage model. The flame model uses
a center-arc curve with a Gaussian cross-section model to
describe the flame coverage in real time.
We have implemented our system and algorithm and
performed physical experiments in the field. Our results
show that the new design of the robotic flaming system
is effective and that our flame estimation algorithm can
provide a satisfactory prediction of the flame coverage. The
experiments have shown that our model and algorithm can
achieve a mean average precision (mAP) of more than 76%
in the re-projected images in real-time prediction.
II. RELATED WORK
This paper is closely related to robotic weed control and
flame estimation.
Robotic Weed Control: While agricultural robotics
presents new challenges to researchers [1], [2], the topics
of scene perception [3]–[6], autonomous navigation [7], mo-
tion planning [8], and manipulation [9]–[11] in agricultural
environments have been extensively explored.
For robotic weed control, existing research has covered
weed detection [3] and various control mechanisms, such as
cultivation [12], stamping [13], mowing [14], and precise
application of herbicides [6], [15]. As an alternative to
these mechanical or chemical weed control methods, weed
flaming is contact-free and has been shown to be effective on
herbicide-resistant weed species while capable of preserving
plant residues to reduce water erosion and leaving no chem-
ical residues [16]. The non-selective characteristic of weed
flaming makes it applicable to various kinds of weeds, but
it also poses a challenge for precise flame control to avoid
collateral damage to the field. More specifically, flaming
should desiccate the weed with minimal disturbance to crops
or soil microorganisms [?], [16]. Studies on manual weed
flaming have explored feasibility [17], effectiveness [18],
best application conditions [16], and the impact of burning angles,
burner designs, and shielding methods on weed flaming [19].
Due to the high labor cost and the lack of precise flame
control methods, the application of manual weed flaming is
limited. Meanwhile, robotic weed flaming has not been well
studied, which presents unique problems for us to explore.
Flame Estimation: Robotic weed flaming poses a unique
challenge of modeling the flame as a soft end-effector in a
dynamic cross-wind environment, which is unprecedented in
other robotic manipulation applications. Rigid end-effectors
are commonly used in object grasping, crop harvesting and
manufacturing [9], [20], and their behaviors can be modeled
by forward kinematics [21]. As for soft end-effectors, they
are typically modeled based on their individual physical
properties. Notable examples that use gas or liquid flows
as soft end-effectors include air levitation conveyors [22],
[23], surgical water jets [24] and herbicide/paint sprayers [6],
[25]–[27]. Similar to these soft end-effectors, the flame for
weed removal needs a model that can capture its deformation
caused by the outdoor environment and the fluctuation in gas
fuel pressure. In this paper, we propose a data-driven model to
estimate the flame direction and coverage without explicitly
deriving the complex physical model since it is not necessary
for our application.
The combustion process of flame has been studied ex-
tensively in the field of energy engineering. Modeling and
estimating the flame direction and coverage for weed re-
moval is closely related to the topics of flame tomography
reconstruction and flame radiation estimation.
Flame tomography reconstruction aims to recover the
flame temperature distribution using the computed tomogra-
phy technique. By observing the flame emission or absorp-
tion spectrum from multiple perspectives [28], [29], the flame
temperature distribution is reconstructed offline as volumetric
grids. Flame tomography reconstruction can achieve superior
spatial and temporal resolution by taking measurements at
regular intervals using camera/photodiode arrays [30], [31].
In [30], the authors reconstructed a 70 × 70 × 105 mm³ flame
volume at 5 kHz using six cameras and two workstations
with over 200 GB of RAM. Although flame tomography
reconstruction is ideal for detailed profiling of the indoor
flaming process, its complicated setup and computational
cost make it impractical in field applications.
Flame radiation estimation focuses on predicting the trans-
fer of flame energy in outdoor environments. Semi-empirical
models that capture flame characteristics, such as width
and center line length, are widely adopted to avoid the
complexity of establishing an accurate physical model of
outdoor flame radiation [32]. Wang et al. [33] proposed a
2D flame length prediction model for propane jet fires in a
crosswind environment based on the circular arc flame center
line assumption. Zhou et al. [34] presented a line source
radiation model and measured three different flame shapes
to predict the flame heat flux profile. These semi-empirical
models provide valuable insights into the flame characteristics
in outdoor environments, but their focus on large-scale flame
width and center-line estimation in 2D makes them unsuitable
for 3D flame coverage estimation in fields.
Building on the existing methods, we are developing a
3D flame surface model that requires measurements from as
few as two camera perspectives and can be used in real-time
field applications.
III. SYSTEM DESIGN
We begin with the overall design of the hardware system
before introducing the software diagram.
A. Hardware System
The system uses the Spot Mini™ quadruped robot from
Boston Dynamics™ as the moving platform with a lightweight
6-DoF Unitree™ Z1 manipulator/arm mounted on
the back. The arm has a payload capacity of 2 kg. The
Spot and the arm are both powered by the Spot’s internal
battery, which provides 58.8 V direct current (DC). A step-
down DC-DC converter is used to provide power for the arm
and on-board computer. An Intel Realsense™ D435 RGB-
D camera is mounted on the manipulator end-effector for
weed detection and localization. Two FLIR™ Lepton 3.5
thermal cameras are mounted on the end-effector and robot
body to monitor the flame direction and coverage. The arm,
thermal cameras and RGB-D camera form a hand-eye vision
system, as this configuration allows the system to obtain
observations of the weed scene and torch flame at different
angles and distances. All perception, decision-making, and
control algorithms are executed on an on-board computer
which is the Spot Core from Boston Dynamics™ . To provide
the flaming capability, a propane tank is mounted on the
Spot, and the propane gas is delivered to the torch mounted
at the tip of the arm through a torch control unit. The unit
receives control commands from the onboard computer and
turns on/off the gas flow when the torch is at the desired
actuation position. A relay-controlled lighter mounted next
to the torch is used to ignite the fuel.
Fig. 2. Illustration of the robot coordinate systems with color-coded axes.
The coordinate systems of our weed removal robot are
shown in Fig. 2. The two thermal camera frames {C}, {C′}
and the frame {CE} of the RGB-D camera mounted on the
end-effector are calibrated to the manipulator frame {M}
and the end-effector frame {E} using hand-eye calibration.
The transformation between the torch nozzle head {H} and
the cameras {C′} and {CE} is established using the extrinsic
calibration of the camera with a custom-designed checkerboard
mounted on the nozzle head. The Spot robot body frame {B}
and the manipulator frame {M} are calibrated using the
hand-eye calibration results and the extrinsic camera calibration
between the end-effector RGB-D camera and the robot
Grayscale-D camera.
Fig. 3. Software diagram. The software system is partitioned into two
components, with the shaded blocks controlling the Spot body motion and
the rest directing the manipulator motion for weed flaming.
B. Software Design
Fig. 3 illustrates the software pipeline for our robotic weed
flaming system. The data processing procedure begins with a
preliminary coarse localization of the weed, which uses the
weed detection model trained within the YOLOv6 framework
[35]. Subsequently, the Spot robot planning component gen-
erates a trajectory, using the Spot software development kit
(SDK), thus guiding the Spot robot from its initial state to
a position in close proximity to the identified weed. This
step ensures that the weed is within the region that allows
the manipulator to perform the flaming operation (the gray
box in Fig. 3). More details on the planning algorithm
can be found in [?]. After Spot arrives at the planned
position, we perform weed center detection using the close-
up view from the RGB-D camera. Meanwhile, the online
flame estimation is performed to estimate the temperature
thresholded flame surface for precise manipulation. As the
last step, the manipulation planning finds the state for the
flaming pose of the end effector using the combined position
of the center of the weed and the flame surface. The motion
plan of the arm is computed using the Unitree™ arm SDK.
The system is an integration of many existing develop-
ments with an important algorithmic challenge which is
online flame estimation. Unlike fixed end-effectors in other
robotic manipulation tasks, the fire-flame front is not a rigid
body, and understanding its coverage in real time determines
the success of the weed-flaming task. In the following, we
will explain how we address this problem.
Fig. 4. (a) Illustration of flame projection. The projected thermal images
in the first and second views are shown in the short and long red boxes,
respectively. Using the center line (black dashed line), each backprojected
ray from {C} (blue dotted lines) determines a flame surface point (red stars),
while the depth ambiguity in rays from {C′} (red dotted lines) blocks the
surface point estimation. (b) Illustration of the flame center arc model. (c)
Illustration of the flame cross-section width function w(l). The red line
shows the fitted w(l) and the black dots are the measurements.
IV. PROBLEM FORMULATION
The online flame coverage estimation problem takes im-
ages from two thermal cameras to estimate flame coverage.
As shown in Fig. 1, thermal camera 1 is installed in front of
the Spot body and provides a side view of the flame. Thermal
camera 2 is installed on the end effector of the arm.
A. Assumption
To allow us to focus on the key problem, we have the
following assumptions.
a.1 The flame cross-section temperature distribution is a 2D
isotropic Gaussian distribution.
a.2 The initial flame direction is the same as the torch
direction.
a.3 The flame radiates uniformly over its boundary surface
[36], [37].
a.4 All cameras and the manipulator have been calibrated
and hence the intrinsics of the cameras and relative
poses among frames in Fig. 2 are known.
a.5 The ambient thermal radiation is constant.
B. Nomenclature
{H} is the 3D frame of the torch nozzle head. Its origin is
at the center of the nozzle head. Its Z-axis is parallel to
the direction of the nozzle. All variables are defaulted
to {H}.
{C} is the 3D frame of the first thermal camera. Its origin is
at the camera optical center, and its Z-axis coincides with
the optical axis, pointing in the forward direction of
the camera. Similarly, we define {C′} as the 3D frame
for the second thermal camera.
$\tilde{x}, \tilde{x}'$ are points in the first and second views, $\tilde{x}, \tilde{x}' \in \mathbb{R}^2$,
respectively. The homogeneous counterpart of $\tilde{x}$ is
$x := [\tilde{x}^T\ 1]^T \in \mathbb{P}^2$, and $x' := [\tilde{x}'^T\ 1]^T \in \mathbb{P}^2$.
$\tilde{X}$ is a point in 3D Euclidean space, $\tilde{X} \in \mathbb{R}^3$. Its
homogeneous counterpart is $X = [\tilde{X}^T\ 1]^T \in \mathbb{P}^3$.
$F(X)$ is the flame surface function; $F(X) = 0$ if and only if
$X$ is on the surface.
By convention, the prime symbol ′ denotes variables in the
second thermal camera view. Let us define our problem.
C. Problem Definition
Definition 1 (Flame Estimation): Given a temperature
threshold T, thermal images I and I′ from two different
perspectives, and the two-view camera poses $^{C}_{H}T$ and $^{C'}_{H}T$,
estimate the temperature-thresholded flame boundary surface
function F(X).
V. ALGORITHM
A. Flame Modeling
We model the flame with a center curve associated with
a circular cross-section, based on the assumption that the
cross-section temperature follows a 2D isotropic Gaussian
distribution. Let $f_c$ be the flame center curve function such
that $\mathcal{C} = \{X_C \mid f_c(X_C) = 0\}$ contains all the points on the
curve. The flame cross-section at point $X_C$ is a circle centered
at $X_C$ and perpendicular to the flame center curve. For an
arbitrary point $X$, its closest corresponding point on the center
curve is $X^{(c)} = \arg\min_{X_i \in \mathcal{C}} \|X - X_i\|_2$, where $\|\cdot\|_2$ is the
L2 norm. Based on this flame center curve $f_c$, the temperature-thresholded
flame surface function is modeled as
$$F(X) = \|X - X^{(c)}\|_2 - w(l_{X^{(c)}}) \quad (1)$$
where $w(l_{X^{(c)}})$ is the flame cross-section width function
that encodes the radius of the cross-section boundary circle
centered at $X^{(c)}$, and $l_{X^{(c)}}$ is the curve length from the nozzle
position $X_H$ to $X^{(c)}$. As shown in Fig. 4(a), the distance
between $X$ and the center curve is equal to the cross-section
width (red line) only when $X$ is a point on the cross-section
boundary circle. Therefore $F(X) = 0$ defines the boundary
surface of the flame.
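To make the role of (1) concrete, the following short sketch evaluates $F(X)$ for a query point by searching a discretized center arc for the closest point $X^{(c)}$ and subtracting the predicted cross-section width. It is written in Python with illustrative names of our own (e.g., flame_surface_value, width_fn) and is not the authors' implementation.

import numpy as np

def flame_surface_value(X, arc_pts, arc_len, width_fn):
    # arc_pts: (N, 3) samples of the center arc; arc_len: (N,) curve length
    # from the nozzle to each sample; width_fn: callable returning w(l).
    d = np.linalg.norm(arc_pts - X, axis=1)   # distance from X to every arc sample
    i = int(np.argmin(d))                     # index of the closest point X^(c)
    return d[i] - width_fn(arc_len[i])        # F(X) <= 0 means X is inside the flame

A query point is then treated as covered by the flame whenever the returned value is non-positive.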
To model the flame center curve, we extend the approximation
proposed in [33] and use a 3D circular arc representation.
As shown in Fig. 4(b), because the flame starts from the torch
nozzle and its initial direction is the same as the torch direction,
the tangent direction of the circular arc at the nozzle position
$X_H$ is parallel to the Z axis of the torch frame, which means
that the center of the circular arc $X_O$ must lie in the X-Y plane
of the torch frame. Intuitively, this circular arc can be represented
by its radius $r$, central angle $\alpha$, and rotation angle $\beta$
about the Z axis of the torch frame. Here we use the circular arc
midpoint $X_M$ as an alternative representation for simplicity.
Because $X_M$ is a point on the arc, the line $X_H X_M^{(c)}$ formed
by $X_H$ and the X-Y plane projection $X_M^{(c)}$ of $X_M$ must pass
through $X_O$. The intersection of the perpendicular bisector of
$X_H X_M$ and $X_H X_M^{(c)}$ is the arc center $X_O$. While deriving
$X_O$ from $X_M$ resolves the radius $r$ and rotation angle $\beta$, the
central angle $\alpha$ can be recovered from the fact that
$\angle X_H X_M X_M^{(c)}$ is equal to $\frac{\alpha}{4}$. Therefore, the flame center
circular arc parameters can be fully represented by the circular
arc midpoint $X_M$. We have
$\alpha = 4\,\mathrm{asin}\frac{\|[X_M]_{1:2}\|_2}{\|X_M\|_2}$,
$\beta = \mathrm{atan2}([X_M]_2, [X_M]_1)$, and
$r = \frac{\|X_M\|_2}{2\sin\frac{\alpha}{4}}$, where $[X_M]_{1:2}$ is the first two
components of $X_M$. Using the midpoint representation $X_M$
also enhances robustness during parameter estimation. When
the flame center curve approaches a straight line, the estimated
midpoint approaches the Z axis of the torch frame, which avoids
the numerical instability caused by the circular arc radius
approaching infinity.
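The closed-form relations above translate directly into code. The following minimal sketch (Python; the function name is ours, and $X_M$ is assumed to be expressed in the torch frame {H}) recovers the arc parameters from the estimated midpoint.

import numpy as np

def arc_params_from_midpoint(X_M):
    X_M = np.asarray(X_M, dtype=float)
    norm_xy = np.linalg.norm(X_M[:2])            # ||[X_M]_{1:2}||_2, in-plane component
    norm_all = np.linalg.norm(X_M)               # ||X_M||_2
    alpha = 4.0 * np.arcsin(norm_xy / norm_all)  # central angle
    beta = np.arctan2(X_M[1], X_M[0])            # rotation angle about the torch Z axis
    # Note: as the flame straightens, alpha -> 0 and r diverges, which is why the
    # midpoint itself is the preferred parameterization during estimation.
    r = norm_all / (2.0 * np.sin(alpha / 4.0))   # arc radius
    return alpha, beta, r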
In the remainder of this section, we present the estimation
of the flame center circular arc and of the cross-section width
function.
B. Flame Center Curve Estimation
Because the flame center arc parameter $X_M$ is estimated
from points in the thermal images, let us first introduce
the camera projection model. Based on the pinhole camera
model [38], a 3D point $X$ and its counterpart $x$ in the camera
image satisfy
$$x = \lambda K [R\ t] X. \quad (2)$$
Here $\lambda$ is a scaling factor, $K$ is the intrinsic matrix of
the camera, and $R$ and $t$ are the rotation and translation
components of the transformation matrix
$^{C}_{H}T = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$,
which transforms from frame {H} to frame {C}. $^{C}_{H}T$ is
obtained from the forward kinematics of the manipulator and
the extrinsic calibration of the thermal cameras and the torch.
Similarly, we can define $x'$, $\lambda'$, $K'$, $R'$, and $t'$ for the second
view, which follow the same pinhole model described in (2).
By color thresholding the thermal image from the first-view
camera, we extract the center point $x_c$ of the thresholded
area. This center point is a projection of the circular arc
midpoint $X_M$; therefore they satisfy
$$[x_c]_\times K [R\ t] X_M = 0. \quad (3)$$
Here $[\cdot]_\times$ is the skew-symmetric matrix. For the second view,
(3) can be rewritten by replacing $x_c$, $K$, $R$, and $t$ with their
counterparts. The flame center arc midpoint $X_M$ is solved
from the two-view stacked version of (3), $A X_M = 0$, where
$$A = \begin{bmatrix} [x_c]_\times K [R\ t] \\ [x'_c]_\times K' [R'\ t'] \end{bmatrix}.$$
Based on the estimated midpoint $X_M$, the flame center
curve point set $\mathcal{C}$ is assembled with points $X_C$ derived from
the estimated midpoint $X_M$ as
$$X_C = \begin{bmatrix} R_Z(\beta) & X_O \\ 0 & 1 \end{bmatrix}
\begin{bmatrix} -r\cos a \\ 0 \\ r\sin a \\ 1 \end{bmatrix},
\quad a \in [0, \alpha], \quad (4)$$
where $R_Z(\cdot)$ returns the Z-axis rotation matrix,
$X_O = R_Z(\beta)[r\ 0\ 0]^T$, and $\alpha$, $\beta$, and $r$ are the circular
arc parameters derived in Sec. V-A.
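A compact sketch of this two-view estimate is given below (Python/NumPy). It solves the stacked homogeneous system $A X_M = 0$ by SVD, recovers the arc parameters with the helper sketched earlier, and samples the center curve starting at the nozzle, following the arc geometry of (4). Here K, R, t are each thermal camera's intrinsics and its pose in the torch frame {H}; all function names are illustrative, not the authors' code.

import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def skew(x):
    return np.array([[0.0, -x[2], x[1]],
                     [x[2], 0.0, -x[0]],
                     [-x[1], x[0], 0.0]])

def estimate_center_arc(xc1, xc2, K1, R1, t1, K2, R2, t2, n_samples=50):
    P1 = K1 @ np.hstack([R1, t1.reshape(3, 1)])      # projection matrix, view 1
    P2 = K2 @ np.hstack([R2, t2.reshape(3, 1)])      # projection matrix, view 2
    A = np.vstack([skew(np.append(xc1, 1.0)) @ P1,   # stacked constraints of Eq. (3)
                   skew(np.append(xc2, 1.0)) @ P2])
    _, _, Vt = np.linalg.svd(A)
    X_M = Vt[-1]
    X_M = (X_M / X_M[-1])[:3]                        # de-homogenized arc midpoint
    alpha, beta, r = arc_params_from_midpoint(X_M)   # helper from the Sec. V-A sketch
    Rz = rot_z(beta)
    # Sweep the central angle from the nozzle (a = 0) to the arc end (a = alpha).
    pts = [Rz @ np.array([r * (1.0 - np.cos(a)), 0.0, r * np.sin(a)])
           for a in np.linspace(0.0, alpha, n_samples)]
    return X_M, np.asarray(pts)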
C. Flame Cross-Section Width Estimation
Similar to classical shape-from-silhouette 3D volumetric
reconstruction techniques [39], [40], we use the thresholded
boundary points in the images to first estimate the 3D flame
surface points, then calculate the width of their corresponding
cross-sections, and finally use Gaussian process (GP)
regression to model the continuous cross-section width
function.
Flame Surface Point Estimation: We recover the flame
surface points based on thresholded boundary points in two
views and the estimated flame center curve.
If a flame surface point projects to thresholded boundary
points in both views, it can be recovered from the back-projected
rays of these boundary points. However, due to the
pixelization error of the cameras, the back-projected rays might
not intersect each other. We set a distance threshold $d$ to
resolve this issue: we register the point closest to both rays
as a surface point when the distance between the two rays
is less than $d$, which is analogous to the voxel size in
volumetric methods. The distance between rays is also used
to identify the correspondences between the boundary points
in the two views by iterating through all their combinations in
the order of their distance to the epipolar line. Let $x_j$ be
the $j$-th point on the thresholded boundary in the first-view
image, and $x'_k$ be the $k$-th boundary point in the second
view. Based on (2), the direction vector of the back-projected
ray from $x_j$ is $v_j = R^T K^{-1} x_j$ and the ray point set is
$\mathcal{L}_j = \{X_j \mid \tilde{X}_j = \lambda v_j - R^T t;\ \lambda \in \mathbb{R}\}$. Similarly, $v'_k$ and $\mathcal{L}'_k$
can be derived from $x'_k$. Given these two back-projected rays
$\mathcal{L}_j$ and $\mathcal{L}'_k$, their closest points to each other are
$\hat{X}_j, \hat{X}'_k = \arg\min_{X_j \in \mathcal{L}_j,\ X'_k \in \mathcal{L}'_k} \|X_j - X'_k\|_2$,
and the acute angle between them is
$\theta_{j,k} = \mathrm{acos}\!\left(\frac{|v_j^T v'_k|}{|v_j||v'_k|}\right)$. A flame surface point
is registered to the surface point set $\mathcal{S}$ as
$$X_s = \frac{\hat{X}_j + \hat{X}'_k}{2} \in \mathcal{S}, \quad
\text{if } (\|\hat{X}_j - \hat{X}'_k\|_2 < d) \wedge (\theta_{j,k} > \theta). \quad (5)$$
Here $d$ identifies their correspondence, and $\theta$ determines the
acceptable uncertainty range of this estimation; a smaller $\theta$
means the acceptable uncertainty range is larger.
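For a candidate pair of boundary points, the ray-ray test of (5) reduces to a small linear system. A minimal sketch is given below (Python; d_max and theta_min stand for the thresholds $d$ and $\theta$, the ray origins and directions are assumed to be expressed in {H}, and the variable names are ours).

import numpy as np

def register_surface_point(o1, v1, o2, v2, d_max=0.01, theta_min=np.deg2rad(10)):
    v1 = v1 / np.linalg.norm(v1)
    v2 = v2 / np.linalg.norm(v2)
    theta = np.arccos(np.clip(abs(v1 @ v2), 0.0, 1.0))    # acute angle between rays
    if theta <= theta_min:                                # nearly parallel: too uncertain
        return None
    # Solve for ray parameters (s, t) minimizing ||(o1 + s*v1) - (o2 + t*v2)||.
    b = o2 - o1
    A = np.array([[1.0, -(v1 @ v2)],
                  [v1 @ v2, -1.0]])
    s, t = np.linalg.solve(A, np.array([v1 @ b, v2 @ b]))
    p1, p2 = o1 + s * v1, o2 + t * v2                     # closest points on each ray
    if np.linalg.norm(p1 - p2) < d_max:                   # rays (nearly) intersect
        return 0.5 * (p1 + p2)                            # registered surface point X_s
    return None                                           # no reliable correspondence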
If a flame surface point projects to a thresholded boundary
point in only one view, it can be recovered from the flame
center curve and the back-projected ray of this boundary
point. Due to depth ambiguity, this surface point can be
uniquely identified only when the back-projected ray is
perpendicular to the center curve (blue dotted line in Fig. 4(a));
otherwise the surface points on the back-projected ray are
indistinguishable (points on the red dotted line in Fig. 4(a)),
potentially leading to a wrong cross-section width estimation
(black line in Fig. 4(a)). Let the back-projected ray of the
$j$-th boundary point in the first view be $\mathcal{L}_j$ with direction
vector $v_j$, and let the gradient direction of the center curve
$\mathcal{C}$ at point $\hat{X}_c$ be $v_c$. The closest points between them are
$\hat{X}_j, \hat{X}_c = \arg\min_{X_c \in \mathcal{C},\ X_j \in \mathcal{L}_j} \|X_j - X_c\|_2$.
A flame surface point is registered to the surface point set $\mathcal{S}$ as
$$X_s = \hat{X}_j \in \mathcal{S}, \quad \text{if } v_c^T v_j = 0. \quad (6)$$
For boundary points in the second view, their corresponding
surface points can be derived in the same way. The perpendicularity
condition can be relaxed to
$\mathrm{acos}\!\left(\frac{|v_j^T v_c|}{|v_j||v_c|}\right) > \gamma$ when
a certain amount of depth ambiguity is acceptable.
Cross-Section Width Function Estimation: With the
estimated flame surface points $X_s \in \mathcal{S}$, we first obtain
their closest points $X_s^{(c)}$ on the center curve in the same
way as in (1), and then calculate their corresponding cross-section
widths $w_s = \|X_s - X_s^{(c)}\|_2$ and circular arc lengths
$l_s = 2r\,\mathrm{asin}\!\left(\frac{\|X_s^{(c)} - X_H\|_2}{2r}\right)$. Using these
training data, we fit the cross-section width prediction function
$w(l)$ using GP regression as
$$w(l) \sim \mathcal{GP}\!\left(k_*^T (K + \sigma_N^2 I)^{-1} w,\;
k_{**} - k_*^T (K + \sigma_N^2 I)^{-1} k_*\right) \quad (7)$$
where $w = [w_1 \ldots w_N]^T$ are the training cross-section
widths and $\sigma_N$ is the observation variance. Here $K \in \mathbb{R}^{N \times N}$,
$k_* \in \mathbb{R}^{N \times 1}$, and $k_{**} \in \mathbb{R}$ are the covariances of the training-training,
training-query, and query-query points, respectively.
Their individual components are $K[m, n] = \mathrm{cov}(l_m, l_n)$, where
$\mathrm{cov}(\cdot,\cdot)$ is the thin-plate function [41]. Fig. 4(c) shows an
example of the fitted $w(l)$ and the measurements.
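The GP regression in (7) is straightforward to implement. The sketch below (Python) follows the standard GP posterior; since the paper only cites [41] for the thin-plate covariance, the specific kernel form $k(r) = 2|r|^3 - 3Rr^2 + R^3$ and its length scale R used here are our assumptions for illustration.

import numpy as np

def thin_plate(l1, l2, R=1.0):
    # One common 1-D thin-plate covariance; R is a length-scale parameter.
    r = np.abs(l1[:, None] - l2[None, :])
    return 2.0 * r**3 - 3.0 * R * r**2 + R**3

def gp_width_predict(l_train, w_train, l_query, sigma_n=0.01, R=1.0):
    K = thin_plate(l_train, l_train, R) + sigma_n**2 * np.eye(len(l_train))
    K_star = thin_plate(l_train, l_query, R)              # training-query covariances
    K_ss = thin_plate(l_query, l_query, R)                # query-query covariances
    alpha = np.linalg.solve(K, w_train)
    mean = K_star.T @ alpha                               # posterior mean of w(l)
    cov = K_ss - K_star.T @ np.linalg.solve(K, K_star)    # posterior covariance
    return mean, cov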
D. Joint Optimization
With the estimated flame center curve and cross-section
widths, we first evaluate the re-projected flame silhouette in
the two views and then refine the estimation based on an
intersection-over-union (IoU) cost.
Let $\mathcal{I}$ be the set of silhouette points in the first view
obtained by thresholding the original thermal image, and let
$\mathcal{I}'$ be its counterpart in the second view. The set of
re-projected silhouette points in the first view is
$$\mathcal{I}_r = \{x_j \mid \exists X_j \in \mathcal{L}_j,\ F(X_j) \leq 0\}, \quad (8)$$
and let $\mathcal{I}'_r$ be its counterpart in the second view. We jointly
optimize the flame center arc midpoint and the estimated
flame widths $X = [X_M^T\ w^T]^T$ using the IoU as
$$\max_{X}\ \frac{|\mathcal{I} \cap \mathcal{I}_r|}{|\mathcal{I} \cup \mathcal{I}_r|}
+ \frac{|\mathcal{I}' \cap \mathcal{I}'_r|}{|\mathcal{I}' \cup \mathcal{I}'_r|}. \quad (9)$$
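In practice this refinement can be carried out with a derivative-free search over the packed parameter vector, as sketched below (Python/SciPy). The paper does not specify the optimizer, so the Nelder-Mead choice and the render_silhouette placeholder (which would rasterize the re-projected region of (8) for one view) are our assumptions.

import numpy as np
from scipy.optimize import minimize

def iou(a, b):
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union > 0 else 0.0

def refine(x0, mask1, mask2, render_silhouette, cam1, cam2):
    # x0 packs the arc midpoint X_M (3 values) followed by the sampled widths w.
    def cost(x):
        X_M, w = x[:3], x[3:]
        r1 = render_silhouette(X_M, w, cam1)        # re-projected silhouette, view 1
        r2 = render_silhouette(X_M, w, cam2)        # re-projected silhouette, view 2
        return -(iou(mask1, r1) + iou(mask2, r2))   # negate to maximize Eq. (9)
    res = minimize(cost, x0, method="Nelder-Mead")
    return res.x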
VI. EXPERIMENTS
We have constructed the system and tested the entire
system and the online flame estimation algorithm under field
conditions. First, let us show the flame estimation results.
A. Flame Estimation Validation
To validate the proposed online flame estimation algo-
rithm, we have collected a dataset with 156 pairs of im-
ages from the thermal cameras in an indoor environment
where the speed and direction of the wind can be manually
controlled. We manually classified the collected images as
being affected by light/strong wind and used the procedure
described in Sec. V to estimate the circular arc flame model.
Additionally, we also estimated the flame based on a straight
line model to serve as a baseline. We cannot compare our
method with conventional flame reconstruction methods
[28], [30] because they require images from multiple side-view
cameras, which are not available on the weed flaming
robot. The accuracy of the estimates is evaluated by the
mean average precision (mAP) of the re-projected images
of the reconstructed 3D flame model with respect to the
temperature-thresholded thermal images.
Fig. 5. (a) and (b) show the original thermal images and the reprojected
images under light wind and strong wind, respectively. The red area is from
the circular arc model, the white area is the thresholded region, and the blue
outline is from the straight line model. (c) and (d) show the corresponding
reconstructed 3D flame using the circular arc model.
TABLE I
FLAME ESTIMATION MEAN AVERAGE PRECISION
              Light Wind  Strong Wind  Overall
Straight Line    75.8%       68.6%      71.5%
Circular Arc     79.7%       74.5%      76.6%
As shown in Tab. I, the circular arc model outperforms the
baseline straight line model in both light wind and strong
wind conditions. Fig. 5 shows example original and
reprojected images under light and strong wind. The circular arc
model and the straight line model produce similar results
under light wind (Fig. 5(a)), but the performance of the
straight line model degrades under strong wind while the
circular arc model maintains its performance (Fig. 5(b)).
More results from the light-wind and strong-wind datasets
are shown in Fig. 6 and the multimedia attachment.
B. Weed Flaming System Validation
We conducted physical experiments on a raised-bed plot
and in a cotton field to validate our flaming system against
different weeds. We present quantitative results of the raised-bed
plot experiment in this section and qualitative results of
the cotton field experiment in the multimedia attachment.
The experiment in the raised-bed plot focuses on quanti-
tatively testing the full system pipeline from weed detection,
robot and manipulator planning to flaming actuation. This
field was cleared to provide a safe environment for the test.
We conducted 5 trials in total, with a random initial robot
position in each trial. After the Spot robot arrives at the
planned position, the RGB-D camera images are recorded
to validate the detection-planning-actuation results. After the
Fig. 6. Typical results of (a)-(e) the light wind dataset and (f)-(j) the strong
wind dataset.
Fig. 7. Results of the raised-bed plot experiment, trials 1 to 5. The green
bounding boxes show the weed center detection results and the red areas
are the re-projected hot spots captured by the thermal camera after flaming.
flaming has been completed, the thermal images are recorded
for flaming coverage evaluation. Common weeds from south-
ern Texas, such as common sunflower, giant ragweed, and
smell melon, are identified and manually transplanted to the
raised-bed plot for the experiments.
The results of the five raised-bed plot experiment trials
are shown in Fig. 7. Due to the random initial positions,
the weed positions in the images are different. The fact that the images
from all five trials capture the weeds indicates that the Spot
detection-planning-actuation stage is successful and the robot
has moved to a region where the weed is reachable by the
arm for the flaming action. After running the weed center
detection and online flame estimation, the end-effector is
controlled to approach the weed center with the flame tip
based on its planned pose. The hot spots after flaming are
recorded by the thermal camera and re-projected to the RGB
images for evaluation. We measure the performance of the
weed flaming system using the precision of the hot spot areas
with respect to the weed center detection bounding boxes and
the offset between the detected weed centers and the hot spot
centers. Tab. II shows the system precision and the flaming
center offset in each trial, and the results are satisfactory. The
average precision indicates that 94.4% of the flamed area is
within the detected weed bounding box, and the average
center offset is 6.71 cm.
TABLE II
RAISED-BED PLOT EXPERIMENT RESULTS
Trial Index   Precision   Center Offset (cm)
1               96.7%         10.49
2               92.5%          5.72
3               100%           7.47
4               100%           2.54
5               82.6%          7.32
Avg.            94.4%          6.71
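For reference, the two metrics in Tab. II can be computed as sketched below, under our reading of the text: precision is the fraction of re-projected hot-spot pixels that fall inside the detected weed bounding box, and the center offset is the pixel distance between the detected weed center and the hot-spot centroid scaled to centimeters (the scale factor cm_per_px is an assumed input, e.g., derived from the RGB-D depth).

import numpy as np

def flaming_metrics(hot_mask, bbox, weed_center, cm_per_px):
    # hot_mask: boolean image of re-projected hot spots; bbox: (x0, y0, x1, y1)
    # weed center detection box; weed_center: (x, y) detected weed center in pixels.
    x0, y0, x1, y1 = bbox
    ys, xs = np.nonzero(hot_mask)                                  # hot-spot pixels
    if len(xs) == 0:
        return 0.0, float("nan")
    inside = (xs >= x0) & (xs <= x1) & (ys >= y0) & (ys <= y1)
    precision = inside.mean()                                      # fraction inside the box
    centroid = np.array([xs.mean(), ys.mean()])
    offset_cm = np.linalg.norm(centroid - np.asarray(weed_center)) * cm_per_px
    return precision, offset_cm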
The experiment in the cotton field aims to evaluate the
performance of the weed flaming system in a real agricultural
field. The experiment setup is shown in Fig. 1, and the results
are included in the multimedia attachment.
VII. CONCLUSIONS AND FUTURE WORK
We developed a robotic weed-flaming solution using a 6-DoF
manipulator mounted on a quadrupedal robot. We pre-
sented the overall design, hardware integration, and software
pipeline of the mobile manipulator system. We proposed a
flame model with an estimation method that uses a center
arc curve with a Gaussian cross-section model to describe
the flame coverage in real time. The experiments have shown
that our system and algorithm design have been successful.
In the future, we will plan the motion of the mobile
manipulator to achieve dynamic flame coverage of multiple
weeds. New weed removal techniques, such as electrocution,
can also be integrated into the robot. New algorithms will be
developed to enable the new system and improve efficiency.
ACKNOWLEDGMENT
We are grateful to Aaron Kingery and Fengzhi Guo for their
inputs and contributions.
REFERENCES
[1] J. Billingsley, A. Visala, and M. Dunn, “Robotics in agriculture
and forestry,” Springer Handbook of Robotics, pp. 1065–1077, 2008.
[Online]. Available: https://doi.org/10.1007/978-3-540-30301-5 47
[2] S. G. Vougioukas, “Agricultural robotics,” Annual Review of
Control, Robotics, and Autonomous Systems, vol. 2, no. 1,
pp. 365–392, 2019. [Online]. Available: https://doi.org/10.1146/
annurev-control-053018-023617
[3] S. Xie, C. Hu, M. Bagavathiannan, and D. Song, “Toward robotic
weed control: Detection of nutsedge weed in bermudagrass turf using
inaccurate and insufficient training data,” IEEE Robot. Autom. Lett.,
vol. 6, no. 4, pp. 7365–7372, 2021.
[4] G. V. Nardari, A. Cohen, S. W. Chen, X. Liu, V. Arcot, R. A. F.
Romero, and V. Kumar, “Place recognition in forests with urquhart
tessellations,” IEEE Robot. Autom. Lett., vol. 6, no. 2, pp. 279–286,
2021.
[5] E. Marks, F. Magistri, and C. Stachniss, “Precise 3d reconstruction of
plants from uav imagery combining bundle adjustment and template
matching,” in Proc. IEEE Int. Conf. Robot. Autom. (ICRA), Philadel-
phia, May 2022, pp. 2259–2265.
[6] C. Hu, S. Xie, D. Song, J. A. Thomasson, R. G. H. IV, and
M. Bagavathiannan, “Algorithm and system development for robotic
micro-volume herbicide spray towards precision weed management,”
IEEE Robotics and Automation Letters, vol. 7, no. 4, pp. 11 633–
11 640, 2022.
[7] A. N. Sivakumar, S. Modi, M. V. Gasparino, C. Ellis, A. E. Baquero
Velasquez, G. Chowdhary, and S. Gupta, “Learned visual navigation
for under-canopy agricultural robots,” in Proc. Robot.: Sci. Syst.,
Virtual, July 2021.
[8] P. Maini, B. M. Gonultas, and V. Isler, “Online coverage planning for
an autonomous weed mowing robot with curvature constraints,” IEEE
Robot. Autom. Lett., vol. 7, no. 2, pp. 5445–5452, 2022.
[9] J. Davidson, S. Bhusal, C. Mo, M. Karkee, and Q. Zhang, “Robotic
manipulation for specialty crop harvesting: A review of manipulator
and end-effector technologies,” Global Journal of Agricultural and
Allied Sciences, vol. 2, no. 1, pp. 25–41, 2020. [Online]. Available:
https://doi.org/10.35251/gjaas.2020.004
[10] E. Vrochidou, V. N. Tsakalidou, I. Kalathas, T. Gkrimpizis,
T. Pachidis, and V. G. Kaburlasos, “An overview of end effectors
in agricultural robotic harvesting systems,” Agriculture, vol. 12,
no. 8, 2022. [Online]. Available: https://www.mdpi.com/2077-0472/
12/8/1240
[11] A. T. Meshram, A. V. Vanalkar, K. B. Kalambe, and A. M.
Badar, “Pesticide spraying robot for precision agriculture: A
categorical literature review and future trends,” Journal of Field
Robotics, vol. 39, no. 2, pp. 153–171, 2022. [Online]. Available:
https://onlinelibrary.wiley.com/doi/abs/10.1002/rob.22043
[12] C. McCool, J. Beattie, J. Firn, C. Lehnert, J. Kulk, O. Bawden,
R. Russell, and T. Perez, “Efficacy of mechanical weeding tools:
A study into alternative weed management strategies enabled by
robotics,” IEEE Robot. Autom. Lett., vol. 3, no. 2, pp. 1184–1190,
Feb. 2018.
[13] A. Michaels, S. Haug, and A. Albert, “Vision-based high-speed ma-
nipulation for robotic ultra-precise weed control,” in Proc. IEEE/RSJ
Int. Conf. Intell. Robots Syst., Sep. 2015, pp. 5498–5505.
[14] C. Melita, G. Muscato, and M. Poncelet, “A simulation environ-
ment for an augmented global navigation satellite system assisted
autonomous robotic lawn-mower,” J. Intell. Robot. Syst., vol. 71, p.
127–142, Aug. 2013.
[15] W. S. Lee, D. Slaughter, and D. Giles, “Robotic weed control system
for tomatos,” Precis. Agric., vol. 1, pp. 95–113, Jan. 1999.
[16] M. V. Bauer, C. Marx, F. V. Bauer, D. M. Flury, T. Ripken, and
B. Streit, “Thermal weed control technologies for conservation agri-
culture—a review,” Weed Research, vol. 60, no. 4, pp. 241–250, 2020.
[17] P. Šniauka and A. Pocius, “Thermal weed control in strawberry,”
Agronomy Research, vol. 6, pp. 359–366, 2008.
[18] ASCARD, “Comparison of flaming and infrared radiation techniques
for thermal weed control,” Weed Research, vol. 38, no. 1, pp. 69–76,
1998.
[19] K. J. Storeheier, “Basic investigations into flaming for weed control,”
in Symposium on Engineering as a Tool to reduce Pesticide Consump-
tion and Operator Hazards in Horticulture 372, 1993, pp. 195–204.
[20] V. Babin and C. Gosselin, “Mechanisms for robotic grasping and
manipulation,” Annual Review of Control, Robotics, and Autonomous
Systems, vol. 4, no. 1, pp. 573–593, 2021. [Online]. Available:
https://doi.org/10.1146/annurev-control-061520- 010405
[21] M. T. Mason, Mechanics of Robotic Manipulation. The MIT Press,
06 2001. [Online]. Available: https://doi.org/10.7551/mitpress/4527.
001.0001
[22] A. Delettre, G. J. Laurent, and N. Le Fort-Piat, “2-dof contactless
distributed manipulation using superposition of induced air flows,” in
2011 IEEE/RSJ International Conference on Intelligent Robots and
Systems, 2011, pp. 5121–5126.
[23] G. J. Laurent, A. Delettre, and N. Le Fort-Piat, “A new aerodynamic-
traction principle for handling products on an air cushion,” IEEE
Transactions on Robotics, vol. 27, no. 2, pp. 379–384, 2011.
[24] S. Morad, C. Ulbricht, P. Harkin, J. Chan, K. Parker, and
R. Vaidyanathan, “Modelling and control of a water jet cutting probe
for flexible surgical robot,” in 2015 IEEE International Conference on
Automation Science and Engineering (CASE), 2015, pp. 1159–1164.
[25] V. Potkonjak, G. S. Dordevic, D. Kostic, and M. Rasic, “Dynamics of
anthropomorphic painting robot: Quality analysis and cost reduction,”
Robotics and Autonomous Systems, vol. 32, no. 1, pp. 17–38, 2000.
[Online]. Available: https://www.sciencedirect.com/science/article/pii/
S0921889099001268
[26] P. J. From, J. Gunnar, and J. T. Gravdahl, “Optimal paint gun
orientation in spray paint applications—experimental results,” IEEE
Transactions on Automation Science and Engineering, vol. 8, no. 2,
pp. 438–442, 2011.
[27] Y. Chen, W. Chen, B. Li, G. Zhang, and W. Zhang, “Paint thickness
simulation for painting robot trajectory planning: a review,” Industrial
Robot: An International Journal, vol. 44, no. 5, pp. 629–638, Jan
2017. [Online]. Available: https://doi.org/10.1108/IR-07-2016-0205
[28] Y. Jin and G. Situ, “A survey for 3d flame chemiluminescence tomog-
raphy: Theory, algorithms, and applications,” Frontiers in Photonics,
vol. 3, 2022.
[29] C. Liu and L. Xu, “Laser absorption spectroscopy for combustion
diagnosis in reactive flows: A review,” Applied Spectroscopy Reviews,
vol. 54, no. 1, pp. 1–44, 2019.
[30] L. Ma, Y. Wu, Q. Lei, W. Xu, and C. D. Carter, “3d flame topography
and curvature measurements at 5 khz on a premixed turbulent bunsen
flame,” Combustion and Flame, vol. 166, pp. 66–75, 2016.
[31] W. Cai and C. F. Kaminski, “Tomographic absorption spectroscopy
for the study of gas dynamics and reactive flows,” Progress in Energy
and Combustion Science, vol. 59, pp. 1–31, 2017.
[32] N. S. Ab Aziz, R. Md. Kasmani, M. D. M. Samsudin, and A. Ahmad,
“Comparative analysis on semi-empirical models of jet fire for radiant
heat estimation,” Process Integration and Optimization for Sustain-
ability, vol. 3, no. 3, pp. 285–305, Sep 2019.
[33] Q. Wang, A. Lu, L. Chang, and B. Wang, “A new mathematical method
for quantifying flame trajectory length of inclined buoyant jet fires in
crosswind,” Journal of Thermal Analysis and Calorimetry, vol. 147,
no. 20, pp. 11 363–11 371, Oct 2022.
[34] K. Zhou and J. Jiang, “Thermal radiation from vertical turbulent jet
flame: Line source model,” Journal of Heat Transfer, vol. 138, 12
2015.
[35] C. Li, L. Li, H. Jiang, K. Weng, Y. Geng, L. Li, Z. Ke, Q. Li,
M. Cheng, W. Nie, Y. Li, B. Zhang, Y. Liang, L. Zhou, X. Xu,
X. Chu, X. Wei, and X. Wei, “Yolov6: A single-stage object detection
framework for industrial applications,” 2022.
[36] K. S. Mudan, “Thermal radiation hazards from hydrocarbon pool
fires,” Progress in Energy and Combustion Science, vol. 10, no. 1,
pp. 59–80, 1984.
[37] A. Luketa, “Model evaluation protocol for fire models involving fuels
at liquefied natural gas facilities.” SANDIA REPORT, no. SAND2022-
6812, 5 2022.
[38] R. Hartley and A. Zisserman, Multiple View Geometry in Computer
Vision, 2nd ed. Cambridge University Press, 2004.
[39] W. N. Martin and J. K. Aggarwal, “Volumetric descriptions of objects
from multiple views,” IEEE Transactions on Pattern Analysis and
Machine Intelligence, vol. PAMI-5, no. 2, pp. 150–158, 1983.
[40] K. Kutulakos, “Shape from the light field boundary,” in Proceedings of
IEEE Computer Society Conference on Computer Vision and Pattern
Recognition, 1997, pp. 53–59.
[41] O. Williams and A. Fitzgibbon, “Gaussian process implicit surfaces,”
in Gaussian Processes in Practice, April 2007.