Article

A social media-based over layer on the edge for handling emergency-related events


Abstract

Online Social Networks (OSNs), together with messaging services, are tools for the exchange of entertainment-related information. However, they also represent virtual environments capable of providing relevant information related to emergency or criminal events. Thanks to the simplicity of using OSNs in combination with modern ubiquitous Internet of Things (IoT) smart devices, the generation and exploitation of such information is made available to users in real time ever more easily. Unfortunately, its reuse has not yet been taken into consideration due to the lack of specific models and related software tools. In this context, the paper presents a social media-based over layer for supporting the monitoring, detection, computation, and sharing of social media information related to emergency scenarios, centered on smartphones and text mining techniques. The proposal is assessed through two different case studies, by evaluating the performance of different classifiers and by showing the logic of the functionalities of the related apps.
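As a minimal illustration of the kind of text-mining classifier the paper evaluates, the sketch below trains a from-scratch multinomial Naive Bayes model and scores a new message for emergency relevance. The toy training set, labels, and whitespace tokenizer are illustrative assumptions, not the paper's actual pipeline.

```python
import math
from collections import Counter, defaultdict

# Toy training set; texts and labels are made up for illustration.
TRAIN = [
    ("fire spreading near the station send help", "emergency"),
    ("earthquake felt downtown buildings shaking", "emergency"),
    ("great concert tonight loved the band", "other"),
    ("new coffee shop opened on main street", "other"),
]

def tokenize(text):
    return text.lower().split()

class NaiveBayes:
    """Multinomial Naive Bayes with add-one (Laplace) smoothing."""

    def fit(self, samples):
        self.word_counts = defaultdict(Counter)
        self.class_counts = Counter()
        self.vocab = set()
        for text, label in samples:
            self.class_counts[label] += 1
            for tok in tokenize(text):
                self.word_counts[label][tok] += 1
                self.vocab.add(tok)
        return self

    def predict(self, text):
        best, best_lp = None, float("-inf")
        total = sum(self.class_counts.values())
        for label in self.class_counts:
            # Log prior plus smoothed log likelihood of each token.
            lp = math.log(self.class_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for tok in tokenize(text):
                lp += math.log((self.word_counts[label][tok] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

clf = NaiveBayes().fit(TRAIN)
print(clf.predict("help fire near downtown"))  # → emergency
```

A real deployment would swap the toy corpus for labeled tweets and compare several classifiers, as the paper does.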


... Moreover, cross-departments could take collaborative actions to better accomplish emergency tasks through knowledge sharing and information communication [10,11]. In addition, improving the practice of multimodal resources [12], identifying humanitarian information [13], and adjusting management roles [14] can also support the efficiency of CDCEM. ...
Article
Full-text available
Cross-Department Coordination of Emergency Management (CDCEM) is considered a critical dimension in China for solving the problem of emergency management. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) is a method used to build the structural correlation of criteria in uncertain environments to identify critical success factors (CSFs). There are coupling correlations and one-way correlations for interrelationship comparisons between selected factors of CDCEM; therefore, there are two different assessment scales. However, most previous studies applied the DEMATEL method with a single assessment scale to identify CSFs. To fill this gap, an IFS-IVIFS-DEMATEL method is provided in this study to comprehensively identify the CSFs of CDCEM. The intuitionistic fuzzy set (IFS) is used as the assessment scale of coupling correlation, and the interval-valued intuitionistic fuzzy set (IVIFS) is used as the assessment scale of one-way correlation. The two different types of assessment scales were transformed into interval information in the improved approach. Then, using the conduction correlation among factors, a comprehensive correlation matrix was constructed. After that, the ranking of the central degree and cause degree of the factors according to the traditional DEMATEL method was obtained. Finally, a case study of Nanjing’s CDCEM was illustrated to demonstrate that the proposed method is more suitable and reasonable. It is found that the factors of “cross-department organization”, “cross-department information communication and transmission”, “information sharing technology platform”, “cross-department material supply capability”, and “cross-department prediction and early warning” in Nanjing are CSFs of CDCEM, which should be emphasized to strengthen CDCEM. The findings of this study shed light on the cross-department coordination of emergency management mechanisms in uncertain situations, which would be beneficial for improving the efficiency of governmental management.
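The final, crisp step of the traditional DEMATEL computation mentioned above (total-relation matrix, then central and cause degrees) can be sketched in plain Python. The 3x3 direct-influence matrix and the max-row-sum normalisation are illustrative assumptions; the actual method additionally handles the fuzzy assessment scales.

```python
def mat_mul(A, B):
    """Dense matrix product for small list-of-lists matrices."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def mat_inv(A):
    """Gauss-Jordan inverse, adequate for small well-conditioned matrices."""
    n = len(A)
    M = [row[:] + [float(i == j) for j in range(n)]
         for i, row in enumerate(A)]
    for i in range(n):
        pivot = M[i][i]
        M[i] = [x / pivot for x in M[i]]
        for j in range(n):
            if j != i:
                factor = M[j][i]
                M[j] = [a - factor * b for a, b in zip(M[j], M[i])]
    return [row[n:] for row in M]

def dematel_degrees(direct):
    """Central degree R+C and cause degree R-C from a direct-influence matrix."""
    n = len(direct)
    s = max(sum(row) for row in direct)          # common normalisation constant
    D = [[x / s for x in row] for row in direct]
    I = [[float(i == j) for j in range(n)] for i in range(n)]
    IminusD = [[I[i][j] - D[i][j] for j in range(n)] for i in range(n)]
    T = mat_mul(D, mat_inv(IminusD))             # total-relation matrix D(I-D)^-1
    R = [sum(row) for row in T]                  # influence given by each factor
    C = [sum(col) for col in zip(*T)]            # influence received by each factor
    centre = [r + c for r, c in zip(R, C)]
    cause = [r - c for r, c in zip(R, C)]
    return centre, cause

# Made-up 3-factor direct-influence scores (not the Nanjing case-study data).
centre, cause = dematel_degrees([[0, 3, 2], [1, 0, 2], [1, 2, 0]])
print(centre, cause)
```

Factors with a positive cause degree are net influencers; ranking by central degree surfaces the CSF candidates.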
Preprint
Full-text available
Text mining is a technological trend often highlighted in the continuous exchange of information through interconnected media. Its applicability goes beyond private organizations, as the public sector also requires it to treat textual information regarding the services it offers. Within this scenario, public security emerges as a prominent user of text mining, seeking to ensure the construction of data and knowledge bases to support decision-making about law enforcement actions that ensure citizen welfare. The primary objectives of this article are: (i) to develop a survey to identify text mining applications, techniques, opportunities, and challenges in public security, and (ii) to outline research directions concerning these topics and provide insights so that interested researchers can develop new studies. The literature was searched within four databases: Scopus, IEEE Xplore, ACM Digital Library, and Web of Science. A filtering process was applied to extract the works most aligned with the target theme, resulting in the selection of the 194 most relevant works for a literature review. Nineteen key applications of text mining related to public security were identified, along with the most recurrent techniques and technologies reported between 2014 and 2021, supporting the outline of three axes for future directions: one on the possible expansion of objectives for new research; another on changes and adaptations of scope in the methodological context; and the last on expansions and changes in application scenarios based on the literature.
Article
Full-text available
Criminals and related illegal activities represent problems that are neither trivial to predict nor easy to handle once they are identified. The Police Forces (PFs) typically base their strategies solely on their intra-communication, neglecting the involvement of third parties, such as citizens, in the investigation chain, which results in a lack of timeliness among the occurrence of the criminal event, its identification, and the intervention. In this regard, a system based on IoT social devices for supporting the detection and tracking of criminals in the real world is proposed. It aims to enable communication and collaboration between citizens and PFs in the criminal investigation process by combining app-based technologies and embracing the advantages of an Edge-based architecture in terms of responsiveness, energy saving, local data computation and distribution, and information sharing. The proposed model, as well as the algorithms defined on top of it, have been evaluated through a simulator showing the logic of the system's functioning, whereas the functionality of the app was assessed through a user study conducted with a group of 30 users. Finally, the additional advantage in terms of intervention time was compared against statistical results.
Conference Paper
Full-text available
When a terror-related event occurs, there is a surge of traffic on social media comprising informative messages, emotional outbursts, helpful safety tips, and rumors. It is important to understand the behavior manifested on social media sites to gain a better understanding of how to govern and manage in a time of crisis. We undertook a detailed study of Twitter during two recent terror-related events: the Manchester attacks and the Las Vegas shooting. We analyze the tweets during these periods using (a) sentiment analysis, (b) topic analysis, and (c) fake news detection. Our analysis demonstrates the spectrum of emotions evinced in reaction and the way those reactions spread over the event timeline. Also, with respect to topic analysis, we find “echo chambers”: groups of people interested in similar aspects of the event. Encouraged by our results on these two event datasets, the paper seeks to enable a holistic analysis of social media messages in a time of crisis.
Conference Paper
Full-text available
The Philippines is battered by different natural and man-made disasters. It cannot be disputed that, despite its magnificent natural beauty, the country suffers from these devastations. Although the government has put mitigation and prevention plans in place, little consideration is given to what happens once a disaster strikes. The paper focuses on the application of Information and Communications Technology (ICT) in the form of an Android-based mobile application that gives victims the capability to seek help when a disaster or incident strikes. In addition, people can notify others of danger ahead through AppLERT and Facebook so that, through crowdsourcing, they can avoid the affected area. It seeks to help expedite the response time of the responding unit using the mobile phone's built-in GPS.
Article
Full-text available
We show how a disruptive force in mobile computing can be created by extending today's unmodified cloud to a second level consisting of self-managed data centers with no hard state, called cloudlets. These are located at the edge of the Internet, just one wireless hop away from associated mobile devices. By leveraging low-latency offload, cloudlets enable a new class of real-time cognitive assistive applications on wearable devices. By processing high-data-rate sensor inputs such as video close to the point of capture, cloudlets can reduce ingress bandwidth demand into the cloud. By serving as proxies for distant cloud services that are unavailable due to failures or cyberattacks, cloudlets can improve robustness and availability. We caution that proprietary software ecosystems surrounding cloudlets will lead to a fragmented marketplace that fails to realize the full business potential of mobile-cloud convergence. Instead, we urge that the software ecosystem surrounding cloudlets be based on the same principles of openness and end-to-end design that have made the Internet so successful.
Article
With the ever-increasing diffusion of smart devices and Internet of Things (IoT) applications, a completely new set of challenges has been added to the Data Mining domain. Edge Mining and Cloud Mining refer to Data Mining tasks aimed at IoT scenarios and performed according to, respectively, Edge or Cloud computing principles. Given the orthogonality and interdependence among the Data Mining task goals (e.g., accuracy, support, precision), the requirements of IoT applications (mainly bandwidth, energy saving, responsiveness, privacy preserving, and security) and the features of Edge/Cloud deployments (de-centralization, reliability, and ease of management), we propose EdgeMiningSim, a simulation-driven methodology inspired by software engineering principles for enabling IoT Data Mining. Such a methodology drives the domain experts in disclosing actionable knowledge, namely descriptive or predictive models for taking effective actions in the constrained and dynamic IoT scenario. A Smart Monitoring application is instantiated as a case study, aiming to exemplify the EdgeMiningSim approach and to show its benefits in effectively facing all those multifaceted aspects that simultaneously impact IoT Data Mining.
Article
Human activity recognition (HAR) has attracted enormous research interest thanks to its fundamental importance in several domains, spanning from healthcare to security, safety, and entertainment. A robust and consolidated body of literature has focused on the study of activities performed by single individuals, with a great variety of approaches in terms of sensing modalities, recognition techniques, specific sets of recognized activities, and final application objectives. However, much less research attention has been devoted to scenarios in which multiple people perform individual or joint actions and activities, forming groups to achieve given common goals. This problem is often referred to as multi-user activity recognition. With the advent of the Internet-of-Things, smart objects are being pervasively spread in the environment and worn on the human body, enabling contextual and distributed recognition of group and multi-user activities. Therefore, this survey discusses clear motivations and advantages of multi-user activity recognition based on sensing methods, recognition approaches, and practical applications, with attention to related data fusion challenges and techniques. By identifying the critical aspects of this multi-faceted problem, the survey aims to provide a systematic categorization and comparison framework of the state-of-the-art that drives the discussion to important open research challenges and future directions.
Conference Paper
Social networks are quickly becoming the primary medium for discussing what is happening around real-world events. The information that is generated on social platforms like Twitter can produce rich data streams for immediate insights into ongoing matters and the conversations around them. To tackle the problem of event detection, we model events as a list of clusters of trending entities over time. We describe a real-time system for discovering events that is modular in design and novel in scale and speed: it applies clustering on a large stream with millions of entities per minute and produces a dynamically updated set of events. In order to assess clustering methodologies, we build an evaluation dataset derived from a snapshot of the full Twitter Firehose and propose novel metrics for measuring clustering quality. Through experiments and system profiling, we highlight key results from the offline and online pipelines. Finally, we visualize a high profile event on Twitter to show the importance of modeling the evolution of events, especially those detected from social data streams.
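The event-detection idea above, modeling events as clusters of trending entities over time, can be sketched minimally. The stream of (minute, entity-set) pairs below is a made-up stand-in for tweets, and the co-occurrence-threshold clustering is far simpler than a production system handling millions of entities per minute.

```python
from collections import Counter, defaultdict
from itertools import combinations

# Hypothetical stream of (minute, entities-mentioned) pairs standing in for tweets.
STREAM = [
    (0, {"quake", "downtown"}),
    (0, {"quake", "aftershock"}),
    (1, {"quake", "downtown"}),
    (1, {"concert", "band"}),
    (2, {"concert", "band"}),
]

def trending_clusters(stream, window, min_cooccur=2):
    """Group entities whose pairwise co-occurrence within the time window
    reaches min_cooccur; each connected group approximates one event."""
    pair_counts = Counter()
    for minute, entities in stream:
        if minute in window:
            for a, b in combinations(sorted(entities), 2):
                pair_counts[(a, b)] += 1
    # Build an adjacency map of strongly co-occurring entities.
    adj = defaultdict(set)
    for (a, b), c in pair_counts.items():
        if c >= min_cooccur:
            adj[a].add(b)
            adj[b].add(a)
    # Extract connected components with an iterative DFS.
    seen, clusters = set(), []
    for node in list(adj):
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n not in comp:
                comp.add(n)
                stack.extend(adj[n] - comp)
        seen |= comp
        clusters.append(comp)
    return clusters

print(trending_clusters(STREAM, window={0, 1}))
```

Re-running the function over a sliding window yields the dynamically updated event set the paper describes.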
Article
The ever-increasing growth of connected smart devices and IoT verticals is leading to the crucial challenges of handling the massive amount of raw data generated by distributed IoT systems and providing timely feedback to the end-users. Although the existing cloud computing paradigm has an enormous amount of virtual computing power and storage capacity, it might not be able to satisfy delay-sensitive applications, since computing tasks are usually processed at distant cloud servers. To this end, edge/fog computing has recently emerged as a new computing paradigm that helps to extend cloud functionalities to the network edge. Despite several benefits of edge computing, including geo-distribution, mobility support, and location awareness, various communication- and computing-related challenges need to be addressed for future IoT systems. In this regard, this article provides a comprehensive view of the current issues encountered in distributed IoT systems and effective solutions by classifying them into three main categories, namely, radio and computing resource management, intelligent edge-IoT systems, and flexible infrastructure management. Furthermore, an optimization framework for edge-IoT systems is proposed by considering the key performance metrics, including throughput, delay, resource utilization, and energy consumption. Finally, an ML-based case study is presented along with some numerical results to illustrate the significance of ML in edge-IoT computing.
Conference Paper
The rapid growth of the Internet in recent years has brought many advantages to modern society in terms of communication and information sharing. Besides this, new and complex issues are emerging due to network flexibility, openness, and systems integration. The vulnerabilities of systems are at the basis of these issues. Unfortunately, such vulnerabilities in the Internet can affect not only virtual environments in an isolated way, but can also have serious repercussions in the real world. That is why identifying new system vulnerabilities represents important information for malicious parties. Currently, several tools (e.g. Shodan or Censys), which automatically scan the Internet, are available. They first scan the whole IPv4 public address range and its ports in a distributed and random manner, and then the obtained results are published on publicly accessible websites. Such information can later be used for benign or malicious purposes. In the latter case, the main advantage for potential attackers is that they gain reconnaissance data without even directly contacting the targeted device. Additionally, a large list of potential victims sharing the same vulnerability can be rapidly acquired. In this context, this paper aims at providing an overview of various publicly available network vulnerability scanning tools. In particular, the main scanning tools are first identified and classified. Then their main features are described, and finally their advantages and disadvantages are highlighted.
Article
More and more common activities are leading to a sedentary lifestyle, forcing us to sit for several hours every day. In-seat actions contain significant hidden information, which not only reflects current physical health status but can also report mental states. Considering this, we design a system, based on body-worn inertial sensors (attached to the user's wrists) combined with a pressure detection module (deployed on the seat), to recognise and monitor in-seat activities through sensor- and feature-level fusion techniques. Specifically, we focus on four common basic emotion-relevant activities (i.e. interest-, frustration-, sadness- and happiness-related). Our results show that the proposed method, by fusing time- and frequency-domain feature sets from all the deployed sensors, can achieve high accuracy in recognising the considered activities.
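The feature-level fusion described above can be sketched as concatenating time- and frequency-domain features from each sensor into a single vector for a downstream classifier. The signals and the particular feature choices (mean, standard deviation, dominant DFT magnitude) are illustrative assumptions, not the paper's feature set.

```python
import math
import cmath

def time_features(signal):
    """Basic time-domain features: mean and population standard deviation."""
    n = len(signal)
    mean = sum(signal) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    return [mean, std]

def freq_features(signal):
    """Magnitude of the strongest non-DC DFT bin (naive O(n^2) DFT)."""
    n = len(signal)
    mags = []
    for k in range(1, n // 2 + 1):
        coeff = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(signal))
        mags.append(abs(coeff))
    return [max(mags)]

def fuse(wrist_signal, seat_signal):
    """Feature-level fusion: concatenate per-sensor feature vectors
    into one vector for a downstream classifier."""
    return (time_features(wrist_signal) + freq_features(wrist_signal)
            + time_features(seat_signal) + freq_features(seat_signal))

vec = fuse([0.1, 0.9, 0.2, 0.8], [1.0, 1.1, 0.9, 1.0])
print(len(vec))  # 6 features: (mean, std, peak magnitude) per sensor
```

Sensor-level fusion, by contrast, would combine the raw streams before feature extraction.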
Conference Paper
Social media usage is growing every day, and new social media websites are being launched one after another. The huge amount of data generated on social media websites is like a treasure for research and analysis, but it must be processed to reach the information that is sought. An example application for intelligence gathering on Twitter is developed in this study. Nearly 150 thousand Turkish-language tweets were collected from Twitter between specified dates and processed with the Turkish Zemberek NLP (Natural Language Processing) library, and the relations between the data were reported. The crimes were classified according to TUIK (Turkish Statistical Institute) crime statistics, and keywords were defined based on these data. The analysis based on these keywords was applied and the results were reported. It was seen that bomb attacks and terror events were the crime topics most discussed on Twitter in Turkey during the study dates. Larger masses can be reached by expanding the keyword groups, and the pulse of the community can be monitored, so that various measures can be taken through the necessary follow-ups.
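The keyword-based crime categorization described above could be sketched as follows; the keyword-to-category map is a hand-written stand-in for the TUIK-derived keyword groups, and the Turkish keywords are illustrative only.

```python
# Hypothetical keyword-to-category map of the kind the study derives from
# TUIK crime statistics; keywords here are illustrative, not the study's.
CRIME_KEYWORDS = {
    "terror": {"bomba", "patlama", "teror", "saldiri"},
    "theft": {"hirsizlik", "soygun", "gasp"},
}

def categorize(tweet_tokens):
    """Return every crime category whose keyword set intersects the tweet."""
    tokens = set(tweet_tokens)
    return sorted(cat for cat, kws in CRIME_KEYWORDS.items()
                  if kws & tokens)

print(categorize(["ankara", "bomba", "patlama", "haberi"]))  # → ['terror']
```

Expanding the keyword sets, as the study suggests, directly widens the reachable set of matching tweets.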
Conference Paper
Due to the rapid growth of population in the last 20 years, an increased number of instances of heavy recurrent traffic congestion has been observed in cities around the world. This rise in traffic has led to greater numbers of traffic incidents and the subsequent growth of non-recurrent congestion. Existing incident detection techniques are limited to the use of sensors in the transportation network. In this paper, we analyze the potential of Twitter for supporting real-time incident detection in the United Kingdom (UK). We present a methodology for retrieving, processing, and classifying public tweets by combining Natural Language Processing (NLP) techniques with a Support Vector Machine (SVM) algorithm for text classification. Our approach can detect traffic-related tweets with an accuracy of 88.27%.
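A sketch of the kind of NLP preprocessing typically applied to tweets before feeding an SVM; the stop-word list and regular expressions below are illustrative assumptions, not the authors' pipeline.

```python
import re

# Minimal tweet normalisation ahead of feature extraction for a classifier.
STOPWORDS = {"a", "an", "the", "is", "on", "at", "in", "to"}

def preprocess(tweet):
    text = tweet.lower()
    text = re.sub(r"https?://\S+", " ", text)   # drop URLs
    text = re.sub(r"[@#](\w+)", r"\1", text)    # strip @/# but keep the word
    text = re.sub(r"[^a-z0-9\s]", " ", text)    # drop remaining punctuation
    return [t for t in text.split() if t not in STOPWORDS]

print(preprocess("Crash on the #M25 near J10! http://t.co/x @HighwaysUK"))
# → ['crash', 'm25', 'near', 'j10', 'highwaysuk']
```

The resulting token lists would then be vectorised (e.g. bag-of-words or TF-IDF) before training the SVM.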
Article
The technological evolution of mobile user equipments (UEs), such as smartphones or laptops, goes hand-in-hand with the evolution of new mobile applications. However, running computationally demanding applications at the UEs is constrained by their limited battery capacity and energy consumption. A suitable solution for extending the battery lifetime of the UEs is to offload applications demanding heavy processing to a conventional centralized cloud (CC). Nevertheless, this option introduces significant execution delay, consisting of the delivery of the offloaded applications to the cloud and back plus the computation time at the cloud. Such delay is inconvenient and makes offloading unsuitable for real-time applications. To cope with the delay problem, a new emerging concept, known as mobile edge computing (MEC), has been introduced. The MEC brings computation and storage resources to the edge of the mobile network, enabling highly demanding applications to run at the UE while meeting strict delay requirements. The MEC computing resources can also be exploited by operators and third parties for specific purposes. In this paper, we first describe major use cases and reference scenarios where the MEC is applicable. After that, we survey existing concepts integrating MEC functionalities into mobile networks and discuss the current advancement in MEC standardization. The core of this survey then focuses on the user-oriented use case in the MEC, i.e., computation offloading. In this regard, we divide the research on computation offloading into three key areas: i) the decision on computation offloading, ii) the allocation of computing resources within the MEC, and iii) mobility management. Finally, we highlight lessons learned in the area of the MEC and discuss open research challenges yet to be addressed in order to fully enjoy the potential offered by the MEC.
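The offloading decision discussed above can be illustrated with a toy latency comparison; the parameters and the simple additive delay model are assumptions, and real MEC decisions also weigh energy, queueing, and channel state.

```python
def should_offload(cycles, data_bits, f_local_hz, f_mec_hz, uplink_bps,
                   latency_budget_s):
    """Rule-of-thumb offloading decision: offload when transfer delay plus
    remote execution beats local execution, within the latency budget.
    A real decision would also model energy, queueing, and downlink delay."""
    t_local = cycles / f_local_hz
    t_offload = data_bits / uplink_bps + cycles / f_mec_hz
    if t_offload > latency_budget_s:
        return False  # offloading would miss the deadline
    return t_offload < t_local

# 1 Gcycle task, 1 Mbit payload, 1 GHz phone vs 10 GHz edge, 20 Mbps uplink:
# local takes 1 s; offload takes 0.05 s transfer + 0.1 s compute = 0.15 s.
print(should_offload(1e9, 1e6, 1e9, 10e9, 20e6, 0.5))  # → True
```

Tightening the latency budget below the offload time flips the decision back to local execution, which is exactly trade-off i) in the survey's taxonomy.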
Article
Social media is a rich source of up-to-date information about events such as incidents. The sheer amount of available information makes machine learning approaches a necessity to process this information further. This learning problem is often concerned with regionally restricted datasets such as data from only one city. Because social media data such as tweets varies considerably across different cities, the training of efficient models requires labeling data from each city of interest, which is costly and time consuming. To avoid such an expensive labeling procedure, a generalizable model can be trained on data from one city and then applied to data from different cities. In this paper, we present Semantic Abstraction to improve the generalization of tweet classification. In particular, we derive features from Linked Open Data and include location and temporal mentions. A comprehensive evaluation on twenty datasets from ten different cities shows that Semantic Abstraction is indeed a valuable means for improving generalization. We show that this not only holds for a two-class problem where incident-related tweets are separated from non-related ones but also for a four-class problem where three different incident types and a neutral class are distinguished. To get a thorough understanding of the generalization problem itself, we closely examined rule-based models from our evaluation. We conclude that on the one hand, the quality of the model strongly depends on the class distribution. On the other hand, the rules learned on cities with an equal class distribution are in most cases much more intuitive than those induced from skewed distributions. We also found that most of the learned rules rely on the novel semantically abstracted features.
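Semantic Abstraction can be illustrated by replacing concrete location and temporal mentions with abstract tokens so that a model trained on one city transfers to another. The hand-written gazetteer below is a toy stand-in for the Linked Open Data lookups the paper derives its features from.

```python
import re

# Toy gazetteer standing in for Linked Open Data lookups; in the paper the
# abstraction is derived from LOD, here it is a hand-written dictionary.
GAZETTEER = {"berlin": "@LOCATION", "hamburg": "@LOCATION"}
TIME_PATTERN = re.compile(r"\b(\d{1,2}:\d{2}|today|tonight|yesterday)\b")

def abstract(tweet):
    """Replace location and temporal mentions with generic tokens."""
    tokens = [GAZETTEER.get(t, t) for t in tweet.lower().split()]
    return TIME_PATTERN.sub("@TIME", " ".join(tokens))

print(abstract("accident in Berlin today blocking two lanes"))
# → accident in @LOCATION @TIME blocking two lanes
```

Because city-specific surface forms are abstracted away, the same classifier features apply to tweets from any city in the gazetteer.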
Conference Paper
Data accumulated from social media like Facebook, Twitter, Google+ etc. provides real-time information, which can help in preventing unanticipated happenings to a great extent. To acknowledge the role of Twitter in generating valuable information across versatile domains, ‘Terrorism’ has been chosen as the domain of study. For studying an acute issue like terrorism, it is a prerequisite to study ‘topical experts’ in order to uncover reliable and trustworthy sources of information on social media. Lists, a crowd-sourced feature of Twitter, are utilized in recognizing the topical experts. The study identifies about 1000 experts by crawling only 57 lists. The collected tweets from terrorism experts have been classified with 48.89% accuracy. Further, on testing the classifier, 66.67% precision and 88.89% recall are obtained. The study has been validated using a real data set with 73.333% accuracy.
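The reported precision and recall follow from standard confusion-matrix counts; the counts below are chosen only to reproduce the quoted figures and are not from the study.

```python
def precision_recall(tp, fp, fn):
    """Standard precision and recall from confusion-matrix counts."""
    return tp / (tp + fp), tp / (tp + fn)

# Illustrative counts that reproduce the reported 66.67% / 88.89% figures:
# 8 true positives, 4 false positives, 1 false negative.
p, r = precision_recall(tp=8, fp=4, fn=1)
print(f"precision={p:.2%} recall={r:.2%}")  # → precision=66.67% recall=88.89%
```

High recall with lower precision suggests the classifier rarely misses expert tweets but over-flags non-relevant ones.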
Article
As the adoption of embedded systems, mobiles, and other smart devices keeps rising, and the scope of their involvement broadens (for instance in the enablement of Smart City-like scenarios), a pressing need emerges to tame such complexity and reuse as much tooling as possible without resorting to vertical ad-hoc solutions, while at the same time taking into account valid options for infrastructure management and other more advanced functionalities. In this sense, a widely used and competitive Infrastructure-as-a-Service framework such as OpenStack, with its breadth of feature coverage and expanded scope, fits the bill. This work thus describes the rationale, efforts, and results so far achieved in integrating IoT paradigms and resource ecosystems with such a Cloud-oriented environment, focusing on a Smart City scenario and featuring data collection and visualization as example use cases of this integration.
Conference Paper
European Law Enforcement Agencies are increasingly reliant on information and communication technologies and are affected by a society shaped by the Internet and social media. The richness and quantity of information available from open sources, if properly gathered and processed, can provide valuable intelligence and help draw inferences from existing closed-source intelligence. CAPER is an Open Source INTelligence platform for the prevention of organized crime, created in cooperation with European LEAs. CAPER supports information sharing and multi-modal analysis of open and closed information sources, mainly based on Natural Language Processing (NLP) and Visual Analytics (VA) technologies.