Essen, Germany
16-20 June 2019
Ric Dengel1, Nico Florschütz1, Valentin Huber1, Tim Muller1, Jan von Pichowski1, Alexander Rabinowitsch1, Sebastian Scholz1, Hans Schülein1, Eike Steinweg1, Benjamin Stippel1, Peter Stöferle1, Isabell Wittekind1, Oliver Wizemann1, Alexander Zaft1, Lukas Zembrot1, Dominik Woiwode2, and Katrin Griebenow3
1Institut für Informatik – Lehrstuhl VIII, Julius Maximilian University of Würzburg, Germany,
2Gottfried Wilhelm Leibniz University Hanover, Germany
3Johannes Gutenberg University Mainz, Germany
ABSTRACT

This paper describes the REXUS/BEXUS experiment Quad-spectral Unaided Experimental Scanner of Topography (QUEST). QUEST, as part of the REXUS/BEXUS program, was designed, developed, built, tested and operated by a team of 17 students from different German universities. It scanned the planet surface by analyzing an array of four light sensors (RGB and IR) and two spectrometers. A reusable cluster algorithm autonomously computed onboard an overview image of the surface with areas marked depending on the type of the surface. Furthermore, the algorithm's database was generated and optimized before and during flight. Regarding the hardware, a modular sensor framework with standardized interfaces was implemented. The project has been a successful step towards the designated target of building an autonomous system which could be used in interplanetary missions with demanding constraints on the bandwidth.
Key words: Autonomous, Topography Scanner,

1. INTRODUCTION

A key feature of interplanetary satellite missions is the analysis of the planet surface. For this purpose many different sensors are used to get a detailed overview. Nevertheless, the bandwidth for transferring the measurements is limited in a satellite mission. Thus, autonomous evaluation and selection of valuable information on board the satellite is required to reduce the resulting data size.
Satellites make several kinds of observations of the surface of a planet. An interesting research area is the categorization of distinct areas on the surface. However, new algorithms to perform this kind of research are rare. The overall purpose of this experiment is to contribute to landscape categorization based on satellite imaging, with a focus on the requirements of interplanetary missions. These requirements are reflected in a modular system made for reusability, resulting in a testbed for a newly designed algorithm for landscape distinction based on cluster analysis of distinct sensor and camera measurements. The Quad-spectral Unaided Experimental Scanner of Topography (QUEST) mission is based on the following primary scientific objectives:

- Development of a modular sensor framework
- Taking pictures of the earth in different wavelengths
- Differentiation of landscapes based on the taken images
- Onboard data analysis
2. EXPERIMENT DESIGN

2.1. Experiment Concept
To properly categorize the landscape, the algorithm needs a set of predefined data. Therefore, it is necessary to obtain reference data with the given sensors and cameras to calibrate the algorithm. This raw data was collected in previous experiments on the ground, on a plane flying over and below clouds, during the ascent phase of the balloon, and while the final experiment was running on the balloon. After the raw data is collected, some areas are categorized by hand. For this purpose an easy-to-use calibration software was produced and integrated into the ground station. The algorithm does not restrict the number and kinds of sensors and cameras. An ordinary RGB camera, an IR camera and a spectrometer are used. These sensors make it possible to distinguish between snow, vegetation, water, rocks and overlaying clouds. To keep the option of late sensor switches open, a modular sensor framework was designed, as shown in Fig. 1.
Figure 1. Experiment Concept

Every sensor is coupled to a pre-processing unit to provide clearly defined data on a common data bus for all sensors. The data is requested by the main processor and fed to the algorithm. The result is stored and periodically sent to the ground station. Additionally, data logging is performed and the logs are stored for post-experiment analysis of the system behaviour.
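The standardized sensor interface described above can be illustrated with a small sketch. All class and method names here are hypothetical, not the flight software's actual identifiers: the idea is that every sensor is wrapped by a pre-processing unit exposing one uniform method, so the main processor can request data without knowing sensor specifics.

```python
from abc import ABC, abstractmethod

class PreProcessor(ABC):
    """Standardized interface every sensor's pre-processing unit implements."""
    @abstractmethod
    def acquire(self) -> dict:
        """Return clearly defined data for the common data bus."""

class RGBCameraUnit(PreProcessor):
    def acquire(self) -> dict:
        # Placeholder: a real unit would grab and normalize a camera frame.
        return {"type": "rgb", "frame": [[0, 0, 0]]}

class SpectrometerUnit(PreProcessor):
    def acquire(self) -> dict:
        # Placeholder spectrum; the channel count is illustrative only.
        return {"type": "spectrum", "counts": [0.0] * 288}

def main_processor_cycle(units):
    """Request data from every unit; the result would feed the algorithm."""
    return [u.acquire() for u in units]

samples = main_processor_cycle([RGBCameraUnit(), SpectrometerUnit()])
```

Because every unit satisfies the same interface, sensors can be swapped late in development without touching the main processor's code.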
2.2. Mechanics Design
The experiment is composed of two main units, shown in Fig. 2: on the one hand the sensor box, which is located outside of the gondola and contains all the sensors, cameras and the corresponding pre-processors; on the other hand the gondola box, which contains the power distribution and the main processor and serves as the link between the gondola and the experiment. The outward-shifted sensor box is necessary to provide an unobstructed field of view downwards. The whole box has dimensions of 821 x 280 x 380 mm and a mass of 8.5 kg. The experiment is not significantly optimized with respect to volume or mass, although light materials such as aluminum were used for the main structure.
Since the thermal environment on a high-altitude balloon mission is very harsh, a proper thermal design was implemented. Due to a float altitude of 28 km, the system is designed to withstand temperatures down to -60 °C and a pressure below 5 mbar. On the one hand these conditions reduce heat dissipation over air, causing overheating of power-consuming components; on the other hand they pose the risk of freezing passive components. Therefore, several heat sinks were mounted onto the processors and power converters to prevent the system from overheating. Furthermore, thermal insulation in the form of high-density polyethylene was integrated. Also, a heater was integrated to prevent the spectrometer, the most thermally sensitive part, from freezing.
Figure 2. Mechanics Design
2.3. Electronics Design
The experiment unit consists of nine components, separated into inner and outer components. On the outside there is the sensor box including two near-infrared cameras, two visual light cameras and two mini spectrometers. Each camera is paired with a pre-processing unit based on a GUMSTIX IceStorm board. For the cameras a redundant design is used to collect more data and increase reliability.

Inside the gondola there are the other electronic components: a pre-processing board for the spectrometers, the Onboard Computer (OBC) and the Power Distribution Unit (PDU). The PDU provides power for the whole system, which runs on two different voltage levels: it converts the provided battery power to the necessary 3.3 V and 5 V. The redundant pre-processors are powered by two separate power converters. This protects the system from a single point of failure, since the system depends on the images. Furthermore, all boards are mounted on a breakout board which provides a power supply, a data transfer cable (D-Sub 9 standard) and a House-Keeping System (HKS). The HKS collects data from several inputs, including a voltmeter, an ammeter and a temperature sensor. This allows an overview of the system during flight via the ground station (section 4.1). As mentioned in the mechanics section 2.2, a heater is attached to the spectrometer. It is controlled by a modified HKS, which includes a power relay to switch the heater on and off on demand.
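The relay-switched heater lends itself to a simple hysteresis (dead-band) control. The sketch below illustrates the principle; the threshold temperatures are invented for illustration and are not the flight values.

```python
def heater_command(temp_c, heater_on, on_below=-20.0, off_above=-10.0):
    """Hysteresis control for the spectrometer heater relay.

    Returns the new relay state (True = heater powered). The dead band
    between the two thresholds prevents rapid relay toggling.
    Thresholds are illustrative assumptions, not flight parameters.
    """
    if temp_c <= on_below:
        return True          # too cold: switch the heater on
    if temp_c >= off_above:
        return False         # warm enough: switch the heater off
    return heater_on         # inside the dead band: keep current state
```

A controller like this would be evaluated on every housekeeping cycle using the HKS temperature reading.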
Fig. 3 shows a concept overview of all components including the cable connections. The PDU supplies power to each system via a dedicated connector; the HKS boards are powered via the D-Sub 9 cables.
Figure 3. Electronics Component Overview
The OBC board provides all data connections via the D-Sub 9 port, and the sensors are connected via individual cables to the pre-processing boards. The standardization of power and data connectors provides a high level of modularity, which allows any sensor with its pre-processor to be connected to the OBC. The OBC is responsible for data handling and processing as well as for running the scientific algorithm described in section 3.
Communication between the different subsystems is split into three communication systems, for which three different bus systems are used:

1. SPI: Data transfer from the pre-processors to the OBC.
2. I2C: Data logging and transmission of housekeeping data.
3. UART: Command line for controlling the system.
2.4. Software Design
The software is designed to be modular, which allows a high interchangeability of the different components, for example switching between different algorithm types. Furthermore, the software is divided into three main parts, listed below and visualized in Fig. 4.

1. Sensors: Handling data acquisition, pre-processing and transmission to the OBC. For a consistent programming effort and communication, a framework had to be designed.
2. Main processor: Central part, for which an interface had to be designed so that distinct algorithms can be used easily.
3. On Board Data Handler: Not part of the framework, but essential for the experiment, as it is responsible for ground communication and data storage.
Figure 4. Software Design
In general, the design can handle partial system failures. In case of an error on a pre-processor, the main processor is able to work with the remaining pre-processors. If the lost pre-processor recovers, its data will be used again by the main processor. Furthermore, in case of a main processor failure, each subsystem stores its data locally, which means the data can be recovered and analyzed in a later post-flight analysis.
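The failure-tolerant collection behaviour can be sketched in a few lines. This is a simplified illustration under assumed names, not flight code: a failed pre-processor is simply skipped, and as soon as it recovers its data is used again on the next cycle.

```python
def collect(preprocessors):
    """Gather one cycle of data, tolerating individual unit failures.

    preprocessors maps a unit name to a zero-argument read callable.
    Units that raise IOError are skipped; the rest keep the experiment
    running, mirroring the partial-failure handling described above.
    """
    results = {}
    for name, read in preprocessors.items():
        try:
            results[name] = read()
        except IOError:
            continue  # unit lost: proceed with the remaining ones
    return results

def broken():
    raise IOError("link down")

# One camera answers, one is down; the cycle still yields usable data.
data = collect({"cam0": lambda: "frame", "cam1": broken})
```

On a later cycle, a recovered `cam1` would again contribute to `results` without any special-case code.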
Several modes were implemented in the framework:

- Idle: The system takes pictures and stores them. Entered in case the connection to ground is lost, which enables a fully autonomous system. No database updates are possible, since no database can be uploaded. Used before launch.
- One shot: Used as a response to a single sensor request. After getting the request from ground, the system takes one picture with the requested sensor and sends it to the ground station. Afterwards it switches back to the idle mode. Used during ascent and descent.
- Continuous: Switching enabled by a signal from the ground station. Data from the sensors is synchronized and processed by the algorithm. Well-rated results are transmitted to the ground station. Used in the floating phase.
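The mode logic above can be sketched as a small state machine. The names and the exact transition rules here are simplified assumptions for illustration, not the flight implementation:

```python
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()        # autonomous picture taking; used before launch
    ONE_SHOT = auto()    # single requested picture; ascent and descent
    CONTINUOUS = auto()  # synchronized acquisition + algorithm; float phase

def next_mode(mode, ground_link_ok, command=None):
    """Return the mode for the next cycle (simplified sketch)."""
    if not ground_link_ok:
        return Mode.IDLE               # link lost: fall back to autonomy
    if command == "one_shot":
        return Mode.ONE_SHOT
    if command == "continuous":
        return Mode.CONTINUOUS
    if mode is Mode.ONE_SHOT:
        return Mode.IDLE               # return to idle after the shot
    return mode
```

The key property is that losing the ground link always drives the system into the autonomous idle mode, regardless of the current state.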
3. ALGORITHM

The core feature of the project is the algorithm. As outlined in the introduction, many different sensors are used to analyze planet surfaces, but the bandwidth for transferring the measurements is limited on a satellite mission, and new algorithms for autonomous onboard evaluation are rare. Therefore, a new algorithm was developed by the team. The goal was to reduce the amount of data which has to be transmitted while still obtaining detailed knowledge of the surface structure. On earth this corresponds to the differentiation of landscapes such as water, forest, and acreage and fields. QUEST on BEXUS 27 was the first test flight of this new surface analysis algorithm, which can be sketched as follows.
1. The inputs of the algorithm are clearly defined sen-
sor and camera measurements for a given region of
the planet surface.
2. These regions are split into small fragments, for example the pixels of an image sensor.
3. Then an n-dimensional cluster analysis is performed, which groups together data points that have nearly the same coordinates and sensor values. n is composed of the number of coordinate axes and the number of distinct sensor measurements for that fragment.
4. In the next step the algorithm calculates the mean and standard deviation of each type of sensor value for each cluster.
5. These values are compared to reference values which were prepared beforehand, and the most likely category, together with its accuracy, is assigned to the individual points of the cluster.
6. In the last step the clusters are written back into a result image.
The DBSCAN algorithm is used for clustering, since it works well on high-dimensional data sets. See [1] for more information on the algorithm and [2] for its application to multidimensional data sets.
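The six steps can be sketched end to end with off-the-shelf tools. The example below uses scikit-learn's DBSCAN on toy data; the points, reference values and the `eps`/`min_samples` settings are purely illustrative and not the flight database or parameters.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Each row is one fragment (pixel): (x, y, sensor value).
# Real flight data carries more sensor dimensions per fragment.
points = np.array([
    [0, 0, 0.90], [0, 1, 0.95], [1, 0, 0.92],        # bright region
    [10, 10, 0.10], [10, 11, 0.12], [11, 10, 0.08],  # dark region
])

# Step 3: n-dimensional cluster analysis (here n = 3).
labels = DBSCAN(eps=2.0, min_samples=2).fit_predict(points)

# Reference values prepared beforehand (illustrative only).
references = {"snow": 0.9, "water": 0.1}

result = {}
for cluster_id in set(labels) - {-1}:       # -1 marks DBSCAN noise points
    members = points[labels == cluster_id]
    mean_val = members[:, 2].mean()         # step 4: per-cluster statistics
    # Step 5: assign the closest reference category.
    category = min(references, key=lambda c: abs(references[c] - mean_val))
    result[cluster_id] = category
```

In the real system, step 6 would then write the per-cluster category back into a compact result image instead of a dictionary.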
4. GROUND SUPPORT EQUIPMENT

4.1. Ground Support Unit
Two tools, together called the ground support equipment, were developed for controlling the experiment. The Ground Support Unit (GSU) is one part and is responsible for the communication between the experiment and the ground. Its main window has six tabs with the following functions:

- Server control: Managing the Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) listeners.
- Connection: Visualizing status, history, and stability of the connection.
- Telecommand: Sending commands to the experiment.
- Telemetry: Listing of current telemetry data for the main processor and every pre-processor. Visualization of received housekeeping data in configurable diagrams.
- Algorithm: Adjustment of algorithm parameters and upload of a new database for the algorithm.
- Results: View of downloaded images and results.
4.2. Scientific Support Unit
With the Scientific Support Unit (SSU), shown in Fig. 5, the ground control team is able to select different clusters in multiple sensor images and classify them with labels. This manually classified data is used to create parameters for the onboard algorithm, as well as the database the algorithm uses for classification, which is sent to the experiment via the GSU and telecommands. The functionality of the SSU can be described as follows. The SSU allows the user to overlay images from different sensors of the same image set; a set contains images from multiple sensors that were shot at the same time. The opacity of each image of the set can be controlled individually. The images are shown behind a canvas, which allows the user to manually draw clusters on top of the images and to classify the clusters by drawing with different colors for different features. Furthermore, the cluster algorithm can be executed locally on the computer by the SSU to identify clusters which can be assigned to a feature.
Figure 5. Scientific Support Unit
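The per-image opacity overlay can be sketched as a weighted blend. This is only an illustration of the principle; the SSU's actual rendering code is not published here, and the arrays below are stand-ins for real sensor images.

```python
import numpy as np

def overlay(images, opacities):
    """Blend the images of one set with per-image opacity.

    images: list of float arrays in [0, 1], all with the same shape.
    A simple normalized weighted average, as a stand-in for the SSU's
    canvas rendering.
    """
    weights = np.asarray(opacities, dtype=float)
    stack = np.stack(images, axis=0)
    return (weights[:, None, None] * stack).sum(axis=0) / weights.sum()

rgb = np.full((4, 4), 0.8)   # stand-in for one RGB channel
ir = np.full((4, 4), 0.2)    # stand-in for an infrared image
blended = overlay([rgb, ir], [0.75, 0.25])
```

Sliding one opacity toward zero fades that sensor out of the composite, which is what lets the operator visually align clusters across sensors.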
5. RESULTS

During the ascent phase raw images were collected and sent to the ground to generate a database, which was then uploaded. The raw images were analyzed with the SSU to improve the database continuously. As expected, the predicted features became more and more precise over time. Furthermore, the integration of a completely new feature was possible during flight. This improved the results, but the final classification at the end of the flight still left room for improvement, which was addressed during the post-flight analysis.
In general, all components worked as planned. The pre-processors filtered the incoming data to produce synchronized and standardized pictures. The communication between the subsystems worked almost flawlessly, and all data was stored for later analysis. Unfortunately, the picture transmission rate was reduced because an error in the compression algorithm led to larger data amounts being sent over the network. This decreased the number of processed image sets significantly.
5.1. Image Results
During the flight, a total of 54 raw RGB images and 54 raw infrared images were downloaded to the ground station. These recorded images are undersaturated and have a round border from the filters of the cameras. The RGB cameras also had a slight offset relative to the infrared cameras, resulting in translated images. The original images are shown in Fig. 6 and Fig. 7. As expected, the first in-flight result images were not satisfying, as the example in Fig. 8 shows: the clustering yielded clusters that were too large, and the classification was incorrect.
Updates to the parameters of the clustering algorithm as well as updates to the classification database improved the quality of the results, leading to a great improvement towards the end of the flight. Still, at the end of the flight most clusters were classified incorrectly as a result of the undersaturated images. To increase the quality of the images, a filter and a translation were applied during the post-flight analysis to remedy the effects of the misconfigured cameras. The results of this post-processing can be seen in
Fig. 9 and Fig. 10. With the improved images and the improved classification database of the post-flight analysis, the algorithm is able to classify the clusters with very few wrong classifications, as shown in Fig. 11.

Figure 6. In flight original RGB image

Figure 7. In flight original infrared image
The example result in Fig. 11 classifies the following features:

- Acreage and fields
- Border of the image

With a longer flight time, these results could have been achieved in flight through software updates on the pre-processors applying the image filters. Also, the classification database could have been generated in more detail.
Nevertheless, the system worked on board and sent acceptable result images after the generation of an adequate classification database.

Figure 8. Result image based on in flight original raw images

Figure 9. Post flight optimized RGB image
5.2. Data Transmission Analysis
The final data amount is reduced by minimizing the possible values for each pixel of the result image to the number of categorized features. The value range per pixel therefore shrinks from 256 to the number of features, which in our case is five. This means the needed storage per pixel is reduced to only three bits instead of eight. With further compression, the onboard-analyzed data is reduced even further, down to 1-20 percent of the original data per result image for one sensor. For multiple sensors and higher resolutions the reduction grows accordingly, since instead of all the different source images only one result image has to be sent. For four image sensors the result is at most 5% of the original data amount. Furthermore, the uplink for an algorithm database is very small. This fully meets the goal of reducing the amount of data which is produced and has to be sent to earth.
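The bit-count arithmetic above can be checked with a short calculation. The sketch below computes the size of the result image relative to sending every raw sensor image, before any additional compression:

```python
import math

def reduction_ratio(n_features, n_sensors, bits_per_pixel=8):
    """Result-image size relative to sending all raw sensor images.

    One result image encodes each pixel's feature label in
    ceil(log2(n_features)) bits and replaces n_sensors raw images of
    bits_per_pixel each. Further lossless compression (not modeled
    here) shrinks the result even more.
    """
    label_bits = math.ceil(math.log2(n_features))
    return label_bits / (n_sensors * bits_per_pixel)

# Five features, one sensor: 3 bits replace 8 bits per pixel (37.5%).
single = reduction_ratio(5, 1)
# Four image sensors: one 3-bit result replaces four 8-bit images.
quad = reduction_ratio(5, 4)
```

With four sensors the uncompressed ratio is 3/32, just under 10%, which the additional compression then pushes below the 5% figure quoted above.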
6. LESSONS LEARNED

Fortunately, the overall project was very successful. Nevertheless, no project runs perfectly: a lot of issues had to be solved, and difficulties occurred until the very end, all of which were resolved. In the following, the most important issues and the lessons learned from them are presented.

Figure 10. Post flight optimized infrared image

Figure 11. Result image based on post flight optimized raw images
6.1. System testing and verification
A lot of time was spent testing all subsystems in isolation, and in the end time ran short for complete system tests. Issues found when combining the subsystems into one big system were more difficult to solve. For example, there were issues with the pre-processors: as single systems they worked absolutely fine, for communication as well as for image reception and processing. After integrating these subsystems and activating the image sensors, the communication became unreliable; in the end some pre-processors had issues handling both, probably due to manufacturing difficulties. It turned out that the scheduled time for tests after the integration of the subsystems was too short, especially the time for solving all discovered issues. It is therefore important to reserve enough time for testing the complete system. Also, the dark rim on the images and the undersaturated images could have been avoided with more extensive tests after full integration.
6.2. Problem priorities
Because of the effort to solve the compression issue outlined before (see section 5), the other tests were done less thoroughly. Unfortunately, other issues were detected afterwards which could have been solved better. Focusing on the compression issue took time which could have been used to solve problems that would have yielded more benefit with less effort. This means that even while solving the last errors in the system, setting the correct priorities must not be forgotten, even if almost all issues are eventually solved.
7. CONCLUSION

In summary, the project can be described as successful in its main objectives. The system adapted easily to changes and had no problems with the loss of sensors, which occurred during flight and testing, or with other failures in different system parts. The modular concept also provided big advantages for solving problems with sensors or pre-processors: switching to a spare part was possible with very little effort. The experiment itself also worked well; the pre-processing, the system control, and the algorithm all ran onboard during flight. The onboard differentiation of landscapes worked as well, but due to the sensor problems the onboard results were not really satisfying. Fortunately, with extensive post-flight analysis the algorithm results were good, and the concept as a whole was proven to work. After this first proof of concept, the sensor diversity can be increased, the power usage reduced and the processing speed enhanced. All in all, the team learned a lot, was able to test the concept successfully and gathered important experience.
ACKNOWLEDGEMENTS

This research project was supported by the REXUS/BEXUS program. The QUEST team is
proud to be part of this program, enjoyed all the work-
shops a lot and is grateful to have flown on the BEXUS
27 mission in October 2018 from the Esrange Space
Center. The REXUS/BEXUS program is realized under
a bilateral Agency Agreement between the German
Aerospace Center (DLR) and the Swedish National
Space Board (SNSB). The Swedish share of the payload
has been made available to students from other European
countries through the collaboration with the European
Space Agency (ESA). Experts from DLR, SSC, ZARM
and ESA provide technical support to the student teams
throughout the project. EuroLaunch, the cooperation
between the Esrange Space Center of SSC and the
Mobile Rocket Base (MORABA) of DLR, is responsible
for the campaign management and operations of the
launch vehicles. The authors also wish to thank the
endorsing professor H. Kayal (University of Würzburg) and the companies that provided technical and financial support to the experiment team: Mouser Electronics, Inc., Texas Instruments, Inc., Thorlabs GmbH, ZARGES GmbH, Leica Geosystems AG, and µduino.
REFERENCES

[1] M. Ester et al., "A Density-Based Algorithm for Discovering Clusters", Institute for Computer Science, University of Munich, Germany, 1996.
[2] K. Mumtaz et al., "An Analysis on Density Based Clustering of Multi Dimensional Spatial Data", Indian Journal of Computer Science and Engineering.