Crater monitoring through social media observations
I. Gialampoukidis, S. Vrochidis and I. Kompatsiaris
Centre for Research and Technology Hellas, Information Technologies Institute (CERTH-ITI), Thessaloniki, Greece
E-mail: {heliasgj, stefanos, ikom}@iti.gr
Abstract
Lunar craters have attracted the attention not only of scientists but also of citizens. Modern high-resolution cameras with zoom capabilities allow citizens to capture pictures of the Moon and share them on social media platforms such as Twitter. We have collected 69 pictures of the Moon, uploaded on Twitter between 1 January 2017 and 17 April 2017 and associated with the keyword #crater. The lunar pictures are indexed using SIFT descriptors and then clustered with density-based approaches that group them into automatically detected zoom levels.
1. Introduction
The automatic detection and classification of lunar craters has long been one of the most important challenges for lunar experts. Several approaches have been proposed [7] to detect or count craters, in order to assist and accelerate the classification of space images. Crater shapes change over time, and their transition to more complex morphologies has been investigated [5]. Other approaches extract visual descriptors based on the Hough transform for the detection and counting of craters [3]. In contrast to approaches that rely on lunar catalogues of optical images [6], in this work we propose monitoring crater-related activity through social media observations.
2. Methodology and Results
We crawled, through the Twitter API (https://dev.twitter.com), 69 pictures of the Moon posted between 1 January 2017 and 17 April 2017 in response to the keyword #crater. Table 1 reports the number of pictures per month and the daily coverage, defined as

Daily coverage = (number of pictures per month) / (number of days per month)    (1)
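The daily coverage values in Table 1 follow directly from Eq. (1); a minimal check in Python, using only the picture counts and month lengths reported in the table:

# Reproduce the Daily Coverage column of Table 1 from Eq. (1).
counts = {
    "January": (23, 31),
    "February": (19, 28),
    "March": (16, 31),
    "April (1st-17th)": (11, 17),  # only the 17 crawled days count
}
for month, (pictures, days) in counts.items():
    print(f"{month}: {100 * pictures / days:.2f}%")
# Matches Table 1 to within rounding (e.g. January: 23/31 = 74.19%).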
From each crawled Twitter image we extract salient points with the Lip-vireo tool (http://pami.xmu.edu.cn/~wlzhao/lip-vireo.htm) and index all images using SIFT descriptors [4]. A Bag-of-Visual-Words representation [2] with term frequency - inverse document frequency (tf-idf) weighting is then built over a vocabulary of 100 visual words, obtained by k-means clustering with 30 iterations to ensure convergence of the visual vocabulary creation.
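The indexing pipeline can be sketched as follows; this is a minimal illustration that substitutes OpenCV's SIFT for the Lip-vireo tool, and the image folder name is hypothetical:

import glob
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfTransformer

# Hypothetical folder holding the crawled Twitter pictures.
paths = sorted(glob.glob("moon_pictures/*.jpg"))
sift = cv2.SIFT_create()  # OpenCV SIFT stands in for Lip-vireo here

# 1. Extract SIFT descriptors (salient points) from every image.
per_image = []
for p in paths:
    img = cv2.imread(p, cv2.IMREAD_GRAYSCALE)
    _, desc = sift.detectAndCompute(img, None)
    per_image.append(desc if desc is not None else np.empty((0, 128), np.float32))

# 2. Build the 100-word visual vocabulary with k-means (30 iterations).
all_desc = np.vstack([d for d in per_image if len(d)])
kmeans = KMeans(n_clusters=100, max_iter=30, n_init=1, random_state=0).fit(all_desc)

# 3. Represent each image as a visual-word histogram and apply tf-idf weighting.
hist = np.zeros((len(per_image), 100))
for i, desc in enumerate(per_image):
    if len(desc):
        hist[i] = np.bincount(kmeans.predict(desc), minlength=100)
X = TfidfTransformer().fit_transform(hist).toarray()  # tf-idf BoVW vectors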
Table 1: Uploaded pictures per month.
Month Pictures Daily Coverage
January 23 74.19%
February 19 67.86%
March 16 51.61%
April (1st-17th) 11 64.70%
After indexing each image, an OPTICS reachability plot [1] is employed to visualize the cluster structure, i.e. the number of clusters and the optimal density level. The reachability plot indicates that the dataset contains two density-connected groups of pictures (clusters), which are extracted at the density level 0.05, with a lower bound of minPts = 5 pictures per cluster. The adoption of a density-based clustering approach tolerates noise in the dataset and does not require a priori knowledge of the number of clusters. In Figure 1 we present a sample of the two detected groups of lunar images together with one additional group of images (noise).
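A sketch of this clustering step, using scikit-learn's later OPTICS implementation and assuming the reported density level 0.05 plays the role of the extraction threshold eps on the tf-idf vectors X from the previous sketch:

import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import OPTICS, cluster_optics_dbscan

optics = OPTICS(min_samples=5).fit(X)  # minPts = 5, as reported above

# Reachability plot: points in cluster order; valleys correspond to clusters.
plt.plot(np.arange(len(X)), optics.reachability_[optics.ordering_])
plt.xlabel("points (cluster order)")
plt.ylabel("reachability distance")
plt.savefig("reachability_plot.png")

# Extract the flat clustering at density level 0.05; label -1 marks noise,
# i.e. pictures not assigned to any cluster.
labels = cluster_optics_dbscan(
    reachability=optics.reachability_,
    core_distances=optics.core_distances_,
    ordering=optics.ordering_,
    eps=0.05,
)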
3. Summary and Conclusions
On average, we have collected more than one image every two days in response to the keyword #crater. The collected images have been grouped into two main clusters, together with an additional noise group containing the pictures that were not assigned to any cluster.
Figure 1: Sample of the pictures per group, with the detected salient points overlaid on each image. The first group, (a)-(h), shows a complete facet of the Moon; the second group, (i)-(m), zooms into individual craters; the last group, (n)-(r), contains the pictures not assigned to either of the first two groups (noise).
The proposed lunar image clustering process yields two classes of lunar pictures at different zoom levels: one groups clear, zoomed-in views of craters into a single cluster, while the other shows a complete view of the Moon at various phases that correlate with the crawling date. The clustering stage is unsupervised, so new topics can be detected on the fly. We have thereby provided an additional source of planetary images based on crowdsourced information, which comes with metadata such as time, text, location, and links to other users and related posts. This content carries crater information that can be fused with other planetary data to enhance crater monitoring. At present each new Twitter picture is marked as relevant manually, but we plan to train a binary classifier for this purpose.
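One possible setup for such a classifier, sketched here as a linear SVM on the tf-idf BoVW vectors X from Section 2; the labels y below are placeholders standing in for the manual relevance annotations, so this is illustrative rather than part of the reported work:

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

# Placeholder labels: 1 = relevant crater picture, 0 = not relevant.
# In practice these would come from the manual annotations mentioned above.
y = np.random.randint(0, 2, size=X.shape[0])

clf = LinearSVC(C=1.0)
print(cross_val_score(clf, X, y, cv=5))  # rough accuracy estimate
clf.fit(X, y)
# A newly crawled picture would then be labelled by clf.predict
# applied to its tf-idf BoVW vector.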
Acknowledgements
This work has been supported by the EC-funded project KRISTINA (H2020-645012).
References
[1] Ankerst, M., Breunig, M. M., Kriegel, H. P., Sander,
J.: OPTICS: ordering points to identify the clustering
structure. In ACM SIGMOD record, Vol. 28, No. 2,
pp. 49–60, 1999.
[2] Feng, J., Jiao, L. C., Zhang, X., Yang, D.: Bag-of-visual-words based on clonal selection algorithm for SAR image classification. IEEE Geoscience and Remote Sensing Letters, Vol. 8, No. 4, pp. 691–695, 2011.
[3] Galloway, M. J., Benedix, G. K., Bland, P. A., Paxman, J., Towner, M. C., Tan, T.: Automated crater detection and counting using the Hough transform. In Image Processing (ICIP), 2014 IEEE International Conference on, pp. 1579–1583, 2014.
[4] Lowe, D. G.: Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, Vol. 60, No. 2, pp. 91–110, 2004.
[5] Mahanti, P., Robinson, M., Povilaitis, R.: Crater shapes in transition – classification of lunar impact craters in the 5 km to 20 km size range. European Planetary Science Congress, 16–21 October, Pasadena, CA, USA, 2016.
[6] Salamunićcar, G., Lončarić, S., Grumpe, A., Wöhler, C.: Hybrid method for crater detection based on topography reconstruction from optical images and the new LU78287GT catalogue of Lunar impact craters. Advances in Space Research, Vol. 53, No. 12, pp. 1783–1797, 2014.
[7] Sawabe, Y., Matsunaga, T., Rokugawa, S.: Automated detection and classification of lunar craters using multiple approaches. Advances in Space Research, Vol. 37, No. 1, pp. 21–27, 2006.