Article

Energy Consumption Analysis for Bluetooth, WiFi and Cellular Networks

Authors:

Abstract

This document analyzes the average energy consumption of Bluetooth, WiFi (802.11) and cellular networks for transmitting data produced at f bytes per second. It is assumed that a packet is created every t_buf seconds and sent to the respective module for transmission. Thus, the data produced by an application in t_buf seconds is d = t_buf * f bytes, neglecting packet overhead. The energy and current values used in this report are taken from data sheets or published papers, or were provided by the vendor (in the case of Bluetooth).
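The buffering model described in the abstract can be sketched in code. This is an illustrative reconstruction, not the report's own calculation: the two-state radio model and the rate and power figures in the example are assumptions, not values from the report.

```python
# Illustrative sketch of the buffering model above: data produced at
# f bytes/s is buffered for t_buf seconds, then handed to the radio as
# one packet of d = t_buf * f bytes (packet overhead neglected).

def energy_per_interval(f_bytes_per_s, t_buf_s, rate_bytes_per_s,
                        p_tx_w, p_idle_w):
    """Energy (J) spent by the radio per buffering interval.

    Assumes a simple two-state radio: it transmits the d-byte packet at
    `rate_bytes_per_s` drawing `p_tx_w`, then idles for the remainder of
    the interval drawing `p_idle_w`.
    """
    d = t_buf_s * f_bytes_per_s          # bytes per packet
    t_tx = d / rate_bytes_per_s          # time on air
    t_idle = max(t_buf_s - t_tx, 0.0)
    return t_tx * p_tx_w + t_idle * p_idle_w

# Example with made-up numbers: 100 B/s source, 1 s buffering,
# 125 kB/s link, 100 mW transmit power, 1 mW idle power.
e = energy_per_interval(100, 1.0, 125_000, 0.100, 0.001)
```

Note that for a slow source the idle power, not the transmit burst, dominates the per-interval energy in this model.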


... The trade-offs between these two technologies are as follows. With WiFi, a WBAN user can transmit the data packet to the cloud with low power and low delay compared with cellular technology [26], but with a transmission range that does not exceed 100 m [27]. This WiFi capability is crucial for accommodating the power constraints of WBAN sensors while successfully transmitting data to the cloud system. ...
... It was shown that, via WiFi, the transmission power and delay are 10 times lower than with cellular technology (e.g. 3G and LTE) for the same packet size [26,28,29]. While the cellular connection is very costly in terms of power, transmission delay and connection cost (WiFi is mostly free of charge), it is very important for supporting WBAN user mobility when no cloudlet is in range and for the scalability of the system under a large number of WBAN users [8,9,30], as we will discuss in Section 4. ...
... Each experiment in these results lasted 3600 s in a 400 × 600 m area. 400 human subjects move in the area following the random waypoint mobility model with a speed of 2 m/s and a random pause time of 1-10 s, and each user sends a 46-byte packet to the cloud every 10 s. Recall that, via WiFi, transmitting a 46-byte data packet costs about 30 mW [26,28,29] with a delay of 0.045 ms. While, via cellular network connection (e.g. ...
... This clearly shows that the energy consumption within an MT can be modeled as a constant energy per unit time multiplied by the total time for which a wireless interface is active; this time depends on the channel rate, which can take different values based on adaptive rate control. Other works assume that the energy consumed per unit time depends on the transmission rate, i.e., E(R), as shown in [24] and [25]. In this case, the energy consumed per unit time is related to the transmission rate through a lookup table over the various transmission rates [24], [25]. ...
... Other works assume that the energy consumed per unit time depends on the transmission rate, i.e., E(R), as shown in [24] and [25]. In this case, the energy consumed per unit time is related to the transmission rate through a lookup table over the various transmission rates [24], [25]. It is evident in both works [24] and [25] that, as the transmission rate R increases, the energy per bit E(R)/R decreases, which favors using the highest transmission rate between any two communicating entities, achieved when the MT transmits with its highest transmission power capability P_t. ...
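The lookup-table energy model discussed in these excerpts can be illustrated with a short sketch. The table values below are invented for illustration only; they are not taken from [24] or [25], and only mimic the qualitative trend those works report.

```python
# Sketch of the rate-dependent energy model: energy per unit time E(R)
# comes from a lookup table indexed by transmission rate R, and the
# energy per bit is E(R)/R. Table values are illustrative, not measured.

E_TABLE = {  # rate (Mbit/s) -> energy per unit time (W)
    6: 1.0,
    12: 1.3,
    24: 1.8,
    54: 2.5,
}

def energy_per_bit(rate_mbps):
    """Energy in J per bit when transmitting at `rate_mbps`."""
    return E_TABLE[rate_mbps] / (rate_mbps * 1e6)

rates = sorted(E_TABLE)
epb = [energy_per_bit(r) for r in rates]
# With these values, energy per bit strictly decreases as the rate
# grows, which favors the highest available transmission rate.
assert all(a > b for a, b in zip(epb, epb[1:]))
```

The point of the sketch is that E(R) growing sub-linearly in R is enough to make E(R)/R monotonically decreasing, hence the preference for the highest rate.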
Article
In this paper, we address the problem of optimal energy-aware content distribution over wireless networks with mobile-to-mobile cooperation. Given a number of mobile terminals (MTs) interested in downloading a common content via a base station (BS), the MTs are grouped into cooperative groups or coalitions. Within each coalition, an optimally chosen coalition head downloads the content from the BS and either unicasts or multicasts it to the other MTs. The centralized optimization formulations are derived for both unicasting and multicasting among the MTs, along with the suitable simplifications to reduce the complexity of the optimization formulations. Then, a polynomial time heuristic algorithm is proposed to solve the optimization problems for relatively large networks where the optimal solution becomes computationally complex. Furthermore, a distributed algorithm, which is based on coalitional game theory, is developed to allow the MTs to choose, independently, which coalitions to join. Performance results for various scenarios demonstrate that the proposed algorithms lead to significant reduction in the total energy consumed by the MTs. In addition, the proposed centralized and distributed algorithms are shown to have relatively low complexity while achieving a near-optimal performance.
... The following are the trade-offs between these two communication technologies. A WBAN user can transmit the data packet to the cloud with low power and low delay using Wi-Fi compared with 3G or LTE cellular communication [25], [26], but the Wi-Fi transmission range does not exceed 100 m [27]. This Wi-Fi capability is essential for efficient power consumption in WBAN sensors while successfully transmitting data to the cloud system. ...
... The Wi-Fi technology in our implementation should be available in the cloudlet area. Via Wi-Fi, it was shown that transmitting a data packet of 46 bytes costs about 30 mW [7], [25], [28] with a packet delay of 0.045 ms. On the other hand, cellular communication technologies such as 3G or LTE have a longer transmission range, and the user can transmit the data packet to the enterprise cloud from any position covered by the cellular network. ...
... Therefore, cellular communication usually covers a wider geographic area than Wi-Fi. However, transmitting a 46-byte data packet via cellular technology costs about 300 mW with a delay of 0.45 ms [25], [28]. Cellular communication is therefore very expensive in terms of power and delay compared with Wi-Fi, which is mostly free of charge. ...
Article
This paper presents efficient large-scale data collection in wireless body area networks (WBANs) in the presence of a cloudlet-based prototype system. The key contribution of this paper is to collect the observed WBAN data at large scale and convey it in a consistent manner to the service providers at the other end. A model of WBANs, including virtualized machines and cloudlets, is proposed in this work to characterize efficient WBAN data collection. A scalable storage and processing infrastructure has been proposed to support a large-scale WBAN system, capable of efficiently handling the big data generated by a large number of WBAN users. The proposed model supports cost-effective communication through Wi-Fi technology. Performance results of the proposed prototype are evaluated using an advanced CloudSim simulator. The results show that the consumed power and packet delay of the collected data decrease as the number of virtualized machines and cloudlets in the monitored area increases. The results also show that the performance depends on how the virtualized cloudlets are distributed in the target area for a given number of users.
... The trade-offs between these two technologies are as follows. With WiFi, a WBAN user can transmit the data packet to the cloud with low power and low delay compared with cellular technology [29], but with a transmission range that does not exceed 100 m [30]. This WiFi capability is crucial for accommodating the power constraints of WBAN sensors while successfully transmitting data to the cloud system. ...
... In our implementation, the WiFi technology will be available in the cloudlet area. It was shown that, via WiFi, transmitting a 46-byte data packet costs about 30 mW [29,31,32] with a delay of 0.045 ms. On the other hand, a longer-range cellular network connection (e.g. ...
... 3G and LTE) is capable of transmitting the data packet to the cloud from any location covered by the cellular network, which is usually a wider geographic area than WiFi's. It was shown that, via cellular, transmitting a 46-byte data packet costs about 300 mW with a delay of 0.45 ms [29,31,32]. While the cellular connection is very costly in terms of power, transmission delay and connection cost (WiFi is mostly free of charge), it is very important for supporting WBAN user mobility when no cloudlet is in range and for the scalability of the system under a large number of WBAN users, as we will discuss in Section 4. ...
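The per-packet energy implied by the figures quoted in these excerpts can be checked with a quick calculation: energy per packet is roughly power times on-air delay, so the 10× power gap combined with the 10× delay gap yields about a 100× energy gap per packet. A minimal sketch, using only the 46-byte-packet numbers quoted above:

```python
# Back-of-the-envelope comparison of the quoted figures for a 46-byte
# packet: WiFi at about 30 mW for 0.045 ms versus cellular at about
# 300 mW for 0.45 ms. Energy per packet ~= power * on-air time.

def packet_energy_j(power_w, delay_s):
    """Transmission energy (J) for one packet."""
    return power_w * delay_s

wifi = packet_energy_j(0.030, 0.045e-3)      # ~1.35 microjoules
cellular = packet_energy_j(0.300, 0.45e-3)   # ~135 microjoules
ratio = cellular / wifi                      # ~100x per packet
```

This is only a first-order estimate: it ignores radio wake-up, tail energy after transmission, and connection-maintenance costs, which in practice widen the gap further for cellular interfaces.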
Article
Wireless body area networks (WBANs) have emerged as an effective solution for a wide range of healthcare, military and sports applications. Most prior work has studied efficient data collection from individual, traditional WBANs. Cloud computing is a new computing model that is continuously evolving and spreading. This paper presents a novel cloudlet-based efficient data collection prototype system for WBANs. The goal is to make a large scale of monitored WBAN data available to the end user or the service provider in a reliable manner. A WBAN prototype, including Virtualized Machines (VMs) and Virtualized Cloudlets (VCs), has been proposed for simulation, characterizing efficient data collection in WBANs. Using the prototype system, we provide a scalable storage and processing infrastructure for a large-scale WBAN system. This infrastructure can efficiently handle the large volume of data generated by the WBAN system by storing these data and performing analysis operations on them. The proposed model fully supports WBAN system mobility using the cost-effective communication technologies of WiFi and cellular, which are supported by the WBAN and VC systems. This is in contrast to many available mHealth solutions that are limited to high-cost communication technologies such as 3G and LTE. The performance of the proposed prototype is evaluated via an extended version of the CloudSim simulator. It is shown that the average power consumption and delay of the collected data decrease tremendously as the number of VMs and VCs increases.
... The trade-offs between these two technologies are as follows. With WiFi, a BAN user can transmit the data packet to the cloud using a smartphone with low power and low delay compared with cellular technology, but with a transmission range that does not exceed 100 m [17]. This WiFi capability is crucial for accommodating the power constraints of BAN sensors while successfully transmitting data to the cloud system. ...
... In our implementation, the WiFi technology will be available in the cloudlet area. It was shown that, via WiFi, transmitting a data packet costs 30 mW with a delay of 0.045 ms [17]. On the other hand, a longer-range cellular network connection (e.g. ...
... 3G and LTE) is capable of transmitting the data packet to the cloud from any location covered by the cellular network, which is usually a wider geographic area than WiFi's. It was shown that, via cellular, transmitting a data packet costs 300 mW with a delay of 0.45 ms [17]. While the cellular connection is very costly in terms of power, transmission delay and connection cost (WiFi is mostly free of charge), it is very important for supporting BAN user mobility when no cloudlet is in range and for the scalability of the system under a large number of BAN users. ...
Conference Paper
This paper presents a large-scale BAN system in the presence of cloudlet-based data collection. The objective is to minimize the end-to-end packet cost by dynamically choosing how data is collected to the cloud using a cloudlet-based system. The goal is to make the monitored BAN data available to the end user or the service provider in a reliable manner. While reducing packet-to-cloud energy, the proposed work also attempts to minimize the end-to-end packet delay by dynamically choosing a neighboring cloudlet so that the overall delay is minimized. This makes the monitored data available in the cloud in a real-time manner. Note that, in the absence of network congestion in low data-rate BANs, the storage delays due to the data collection method are usually much larger than the congestion delay.
... The following are the trade-offs between these two communication technologies. A BAN user can transmit the data packet to the cloud with low power and low delay using Wi-Fi compared with 3G or LTE cellular communication [19], but the Wi-Fi transmission range does not exceed 100 m [20]. This Wi-Fi capability is essential for efficient power consumption in BAN sensors while successfully transmitting data to the cloud system. ...
... The Wi-Fi technology in our implementation should be available in the cloudlet area. Via Wi-Fi, it was shown that transmitting a data packet of 46 bytes costs about 30 mW [7], [19], [21] with a packet delay of 0.045 ms. On the other hand, cellular communication technologies such as 3G or LTE have a longer transmission range, and the user can transmit the data packet to the enterprise cloud from any position covered by the cellular network. ...
... Therefore, cellular communication usually covers a wider geographic area than Wi-Fi. However, transmitting a 46-byte data packet via cellular technology costs about 300 mW with a delay of 0.45 ms [19], [21]. Cellular communication is therefore very expensive in terms of power and delay compared with Wi-Fi, which is mostly free of charge. ...
Conference Paper
In this paper, we present an efficient big data collection model for body area networks (BANs) using a cloudlet-based system prototype. The novelty of the proposed work is to collect the monitored BAN data at large scale and deliver it reliably to the service providers. A BAN prototype is proposed in this paper that includes virtualized machines and cloudlets in order to characterize efficient BAN data collection. A scalable storage and processing infrastructure has been proposed to support a large-scale BAN system, capable of efficiently handling the big data generated by BAN users. The model supports cost-effective communication through Wi-Fi technology. Performance results of the proposed prototype are evaluated using an advanced CloudSim simulator. The results show that the consumed power and packet delay of the collected data decrease as the number of virtualized machines and cloudlets increases.
... The following are the trade-offs between these two communication technologies. A BAN user can transmit the data packet to the cloud with low power and low delay using Wi-Fi compared with 3G or LTE cellular communication [19], but the Wi-Fi transmission range does not exceed 100 m [20]. This Wi-Fi capability is essential for efficient power consumption in BAN sensors while successfully transmitting data to the cloud system. ...
... The Wi-Fi technology in our implementation should be available in the cloudlet area. Via Wi-Fi, it was shown that transmitting a data packet of 46 bytes costs about 30 mW [7], [19], [21] with a packet delay of 0.045 ms. On the other hand, cellular communication technologies such as 3G or LTE have a longer transmission range, and the user can transmit the data packet to the enterprise cloud from any position covered by the cellular network. ...
... Therefore, cellular communication usually covers a wider geographic area than Wi-Fi. However, transmitting a 46-byte data packet via cellular technology costs about 300 mW with a delay of 0.45 ms [19], [21]. Cellular communication is therefore very expensive in terms of power and delay compared with Wi-Fi, which is mostly free of charge. ...
... The trade-off between these two technologies is as follows. With WiFi, a BAN user can transmit the data packet to the cloud using a smartphone with low power and low delay compared with cellular technology, but with a transmission range that does not exceed 100 m [14]. This WiFi capability is crucial for overcoming the power constraint in a mobile environment while successfully transmitting data to the cloud system. ...
... In our implementation, the WiFi technology will be available in the cloudlet area. It was shown that, via WiFi, transmitting a data packet costs 30 mW with a delay of 0.045 ms [14]. On the other hand, a longer-range cellular network connection (e.g. ...
... 3G and LTE) is capable of transmitting the data packet to the cloud from any location covered by the cellular network, which is usually a wider geographic area than WiFi's. It was shown that, via a cellular connection, transmitting a data packet costs 300 mW with a delay of 0.45 ms [14]. While the cellular connection is very costly in terms of power, transmission delay and connection cost (WiFi is mostly free of charge), it is very important for supporting mobile users when no cloudlet is in range and for the scalability of the system under a large number of users. ...
Conference Paper
Mobile cloud computing is an emerging and fast-growing computing paradigm that has gained great interest from both industry and academia. Consequently, many researchers are actively involved in cloud computing research projects. One major challenge facing mobile cloud computing researchers is the lack of a comprehensive experimental framework to use in their experiments and to evaluate their proposed work. This paper introduces a modeling and simulation environment for mobile cloud computing. The experimental framework can be used to evaluate a wide spectrum of mobile cloud components such as processing elements, storage, networking, applications, etc. The framework is built on top of the CloudExp framework which provides the major building blocks needed for any cloud system. Moreover, mobile cloud experimental framework can exploit CloudExp capabilities to simulate big data generation and processing scenarios. An experimental scenario is also introduced in this paper to demonstrate the capabilities of the proposed framework.
... The differences between these two communication technologies are as follows. A WBAN user with WiFi can transmit the data to the cloud with low delay and low power compared with cellular network technology [41], but with a limited transmission range of 100 m [42]. WiFi's capability supports successful data transmission to the cloud service while accommodating the power constraints of WBAN sensors. ...
... In our prototype system, WiFi is made available by the cloudlet in the covered area. Via WiFi, it was shown that the transmission delay of a 46-byte data packet is roughly 0.045 ms [41,43], with a power cost of 30 mW. On the other hand, a cellular network connection (e.g. ...
... On the other hand, a cellular network connection (e.g. 3G, 4G and LTE) with a longer transmission range can send the data packet to the cloud service from any location covered by cellular technology, which is geographically wider than the area covered by WiFi [41,43]. Via cellular technology, it was shown that the transmission delay of a 46-byte data packet is roughly 0.45 ms, with a power cost of 300 mW [41,43]. ...
... Current applications that deal with real-world challenges generate huge amounts of data and require tremendous computing resources. The exploitation of advanced cloud services for HPC and data-intensive applications (e.g., big data applications) is a common practice supported by both academia and industry (Quwaider and Jararweh, 2013, 2014, 2015). ...
... The following are the trade-offs between these two communication technologies. A WBAN user can transmit the data packet to the cloud with low power and low delay using Wi-Fi compared with 3G or LTE cellular communication (Balani, 2007; Zhang et al., 2013), but the Wi-Fi transmission range does not exceed 100 m (Joseph et al., 2004). This Wi-Fi capability is essential for efficient power consumption in WBAN sensors while successfully transmitting data to the cloud system. ...
... The Wi-Fi technology in our implementation should be available in the cloudlet area. Via Wi-Fi, it was shown that transmitting a data packet of 46 bytes costs about 30 mW (Balani, 2007; Dementyev et al., 2013) with a packet delay of 0.045 ms. On the other hand, cellular communication technologies such as 3G or LTE have a longer transmission range, and the user can transmit the data packet to the enterprise cloud from any position covered by the cellular network. ...
Article
Full-text available
This paper presents efficient large-scale data collection in wireless body area networks (WBANs) in the presence of a cloudlet-based prototype system. The key contribution of this paper is to collect the observed WBAN data at large scale and convey it in a consistent manner to the service providers at the other end. A model of WBANs, including virtualised machines and cloudlets, is proposed in this work to characterise efficient WBAN data collection. A scalable storage and processing infrastructure is proposed to support a large-scale WBAN system, capable of efficiently handling the big data generated by a large number of WBAN users. The proposed model supports cost-effective communication through Wi-Fi technology. Performance results of the proposed prototype are evaluated using an advanced CloudSim simulator. The results show that the consumed power and packet delay of the collected data decrease as the number of virtualised machines and cloudlets in the monitored area increases. The results also show that the performance depends on how the virtualised cloudlets are distributed in the target area for a given number of users.
... Even when data links to a cellular network are available, mobile users can still take advantage of local WiFi contacts to exchange large amounts of data faster and in a more energy-efficient fashion than transmitting data over GSM. In fact, power analysis of different wireless interfaces shows that, for high transmission rates, the GSM radio consumes considerably more power than the WiFi radio [Balani, 2007]. ...
... All these peripheral devices consume energy and place growing pressure on the device's battery life. On the other hand, advances in battery design have stalled for the past few years [Balani, 2007]. With recent advances in mobile multimedia communication, efficient usage of bandwidth is considered an important design criterion, in addition to energy, in developing new mobile technologies. Routing algorithms designed for mobile DTNs transmit data to its destination through intermediate nodes, also known as relays, which might not be interested in the data themselves. ...
... However, this model is not free of costs. Elsewhere [22], it was shown that transmissions using the cellular network interface consume more energy than WiFi and Bluetooth. In addition, implementing these communication paradigms in infrastructure-supported networks would require stronger knowledge of the network, which could not be achieved without frequent message exchanges between the infrastructure and the mobile devices. ...
... Elsewhere [22], [44], it has been shown that long-range wireless network interfaces like 3G or 4G consume significantly more power than short-range ones like WiFi or Bluetooth. Therefore, the CF approach is expected to spend most of its battery on networking activities. ...
Conference Paper
Full-text available
In less than a decade, smartphones and mobile applications spread like wildfire and dramatically improved aspects of our professional and private lives, from efficiency to safety. However, these applications are still in their infancy and mostly provide mobile versions of online Internet services or arcade games. With the exception of simple location-based query applications, context-awareness is largely ignored. However, it is not hard to imagine advanced mobile social networking applications -- SNAPPs for short -- that could proactively assist users in everyday tasks, improving their quality of life. Such services would require massive data collection, processing and communication between mobile devices. Unfortunately, the current centralised communication paradigm represents a major barrier to such intense networking. In this paper, we claim that a fundamental paradigm shift in communication is required to allow such application to see the light of day. The paper claims that such a shift is possible and that it resides in moving towards decentralised communication by taking advantage of the largely untapped network, storage and processing power capabilities offered by idle mobile devices. The paper presents and discusses a number of research questions that must be addressed in order to achieve this paradigm shift.
... The fundamental problem with continuous data transmission is the energy requirement. A result from respective investigations into continuous data transmission at 1 kHz suggests that it can be supported for 24-h monitoring using a 1200-mAh battery [143]. ...
... The analysis presented in [143] regarding the power consumption and longevity of batteries pertains to transmission energy only; adding the energy involved in pre-processing the physiological data at the sensor nodes, including analogue-to-digital conversion, quantization, filtering and microcontroller operation, would bring the effective monitoring time down to 8-10 h, making the entire system power hungry and shortening battery life. An increased battery capacity, like the prismatic 1800-mAh zinc-air battery operating at 1.4 V used recently in the medical community, would increase the size of the sensor nodes. ...
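The battery-longevity claims in these excerpts reduce to simple capacity arithmetic: a 1200-mAh battery sustaining 24 h of operation implies an average current budget of 50 mA, and a shorter achievable runtime implies a proportionally higher average draw. A quick sketch (the supply voltage is not stated in the excerpts, so only current, not power, is computed):

```python
# Capacity arithmetic behind the battery-life figures quoted above:
# average current budget (mA) = capacity (mAh) / runtime (h).

def avg_current_budget_ma(capacity_mah, hours):
    """Average current (mA) a battery can supply over `hours` of operation."""
    return capacity_mah / hours

# 1200 mAh over 24 h of continuous monitoring -> 50 mA average budget.
budget_24h = avg_current_budget_ma(1200, 24)

# If pre-processing cuts the achievable runtime to 8-10 h, the implied
# average draw rises to 120-150 mA.
draw_10h = avg_current_budget_ma(1200, 10)
draw_8h = avg_current_budget_ma(1200, 8)
```

The 1800-mAh zinc-air cell mentioned above would extend any of these runtimes by a factor of 1.5 at the same draw, at the cost of a larger sensor node.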
Chapter
This chapter explores the field of remote sensor systems using wearable technologies that play a significant role in monitoring activities of patients in home and community settings. The focus is on body area sensing networks incorporating the primary enabling technologies: sensors for capturing the physiological and kinematic data, and data analysis techniques for extracting the clinically relevant information. With respect to the StrokeBack project, the majority of this chapter is dedicated towards physical activity monitoring-a key component in stroke rehabilitation. In particular, the domain of upper limb rehabilitation is examined since reduction of upper limb motor function is a common effect of stroke and significantly impairs the performance of patients as they engage in activities of daily life. As an example, a case study is presented where different arm movements are recognized in real time using data from inertial sensors attached to the arm. Tracking the occurrences of specific arm movements (e.g. prescribed exercises) over time can give an indication of rehabilitation progress since the frequency of these movements is expected to increase as motor functionality improves. © Springer International Publishing Switzerland 2016. All rights reserved.
... However, this requires cooperation among the cellular users. Since the energy efficiency of D2D communication decreases as the distance between the communicating users increases [8], [9], two cellular users located far away from each other may be better off downloading the content they need directly from the BS instead of cooperating with each other. For example, consider five cellular users {1, 2, 3, 4, 5} requesting the same file from the BS. ...
... So the sums of the U_i terms are equal in v(S_1) + v(S_2) and v(S_1 ∪ S_2) + v(S_1 ∩ S_2). Hence, by (11), to show that (9) holds, it suffices to show that: ...
Preprint
We consider a set of cellular users associated with a base station (BS) in a cellular network that employs Device-to-device (D2D) communication. A subset of the users request for some files from the BS. Now, some of the users can potentially act as relays and forward the requested files, or partitions of files, from the BS to some of the requesting users (destination nodes) over D2D links. However, this requires cooperation among the cellular users. In this paper, we seek conditions under which users have an incentive to cooperate with each other. We model the above scenario using the frameworks of cooperative game theory and stable partitions in coalitional games. We consider two different models for file transfer within a coalition: (i) Model A, in which the BS can split a file into multiple partitions and send these partitions to different relays, which multicast the partitions to the destination nodes of the coalition, and (ii) Model B, in which for each file, the BS sends the entire file to a single relay, which multicasts it to the destination nodes of the coalition. First, we explore the question of whether it is beneficial for all the cellular users to cooperate, i.e., whether the grand coalition is stable. For this we use the solution concept of core from cooperative game theory. We show that, in general, the above coalitional game under Model A may have an empty core. Next, we provide conditions under which the core is always non-empty and a D_c-stable partition always exists. Also, we show that under Model B, the problem of assigning relays to destination nodes so as to maximize the sum of utilities of all the users is NP-Complete. Finally, we show via numerical computations that a significant reduction in the energy expenditure of cellular users can be achieved via cooperation.
... A result from respective investigations into continuous data transmission at 1 kHz suggests that it can be supported for 24-h monitoring using a 1200-mAh battery [138]. ...
... The analysis presented in [138] regarding the power consumption and longevity of batteries pertains to transmission energy only; adding the energy involved in pre-processing the physiological data at the sensor nodes, including analog-to-digital conversion, would reduce the effective monitoring time. Considering Bluetooth as the primary means of communication, the energy dissipation depends directly on the packet format of the transmitted data, which can be optimized using standard duty cycling but might eventually lead to delays and packet loss, which would be highly undesirable for applications involving remote health monitoring [33]. Therefore, from the long-term system operation perspective, when implementing a wireless body area network (WBAN) comprising heterogeneous sensors, it is imperative to select data analysis algorithms with low computational complexity. ...
Thesis
Full-text available
ICT enabled body-worn remote rehabilitation system has been projected as an effective means for combating the major socio-economic challenge resulting from the need for quality care delivery for stroke survivors. The two major problems faced in such systems are: 1) while effective for characterising the patient’s performance during a constrained “exercise phase” in remote settings, the more natural indicator of rehabilitation status, i.e., the patient’s performance in an “unconstrained nomadic environment”, are often not considered and; 2) being body-worn and thus constrained by the battery life, their sustainability for long-term continuous monitoring is questionable. These shortcomings motivated the: 1) exploration of effective algorithmic strategies for accurately detecting movement of affected body parts, more specifically, the movement of the upper limb since it frequently gets affected by stroke episodes – in unconstrained scenarios and; 2) translation of the algorithms to dedicated low-power hardware with an aim of enhancing the battery life of a resource constrained body-worn sensor based remote rehabilitation system for its sustained operation satisfying the notion of long-term continuous monitoring. Following instructions of expert physiotherapists, this work concentrates on detecting three fundamental upper limb movements in unconstrained scenarios: extension/flexion of the forearm; rotation of the forearm about the elbow; and rotation of the arm about the long axis of forearm, using body-worn inertial sensors. After selecting the appropriate type of inertial sensors and their positions through exhaustive experiments, two novel algorithms were proposed to recognize the above mentioned movements: 1) clustering and minimum distance classifier based approach and 2) tracking the orientation of an inertial sensor placed on the wrist. 
The performance of the algorithms was evaluated prospectively through an archetypal activity, ‘making-a-cup-of-tea’, which includes multiple occurrences of the chosen movements. The proposed clustering-based approach detected the three movements with an average accuracy of 88% and 70% using accelerometer data and 83% and 70% using gyroscope data obtained from the wrist, for healthy subjects and stroke survivors respectively. By comparison, the proposed sensor-orientation-based methodology, using only a wrist-worn accelerometer, recognized the three movements with accuracies in the range of 91-99% for healthy subjects and 70-85% for stroke survivors. However, the clustering-based approach provides greater flexibility in terms of incorporating new types of movements apart from the ones chosen here, and can also be used to track changes in motor functionality over time. Subsequently it was translated into a novel ASIC with a dynamic power consumption of 25.9 mW at 20 MHz in 130 nm technology. The sensor-orientation-based approach was also validated in hardware, using an Altera DEII FPGA system, for high-speed real-time movement recognition.
... As an example, we consider reducing the number of needed GPS and Ozone-level measurements by sharing them over Bluetooth. Indeed, with an average power consumption of less than 20 mW [15], Bluetooth is a very good candidate for replacing the pluggable Ozone sensors [226] (more than 400 mW according to our measurements) whenever another device can share its readings. Designing an energy-efficient sensing protocol for such distributed mobile sensing scenarios is non-trivial, and we face the following challenges: ...
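The power gap quoted above (roughly 20 mW for Bluetooth versus more than 400 mW for the pluggable Ozone sensor) can be turned into a back-of-the-envelope saving estimate. A minimal sketch, assuming, purely for illustration, that a local measurement and a Bluetooth exchange each keep their module active for about one second:

```python
# Hedged sketch: energy trade-off of sharing a sensor reading over Bluetooth
# versus powering a local pluggable Ozone sensor. Power figures are the
# averages quoted in the text; the per-measurement durations are assumptions.

def sensing_energy_mj(power_mw: float, duration_s: float) -> float:
    """Energy in millijoules for holding a module active for duration_s seconds."""
    return power_mw * duration_s

local_ozone = sensing_energy_mj(power_mw=400.0, duration_s=1.0)  # local sensor on for ~1 s
bt_exchange = sensing_energy_mj(power_mw=20.0, duration_s=1.0)   # Bluetooth exchange for ~1 s

savings_factor = local_ozone / bt_exchange
print(f"Local sensing: {local_ozone:.0f} mJ, Bluetooth share: {bt_exchange:.0f} mJ "
      f"(~{savings_factor:.0f}x less energy)")
```

Under these assumptions a shared reading costs roughly a twentieth of a local measurement, which is what makes the protocol design question above worth solving.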
Article
With the ever-increasing adoption of smartphones worldwide, researchers have found the perfect sensor platform to perform context-based research and to prepare context-based services to be deployed for end-users. However, continuous context sensing imposes a considerable challenge in balancing the energy consumption of the sensors, the accuracy of the recognized context and its latency. After outlining the common characteristics of continuous sensing systems, we present a detailed overview of the state of the art, from sensor sub-systems to context inference algorithms. Then, we present the three main contributions of this thesis. The first approach we present is based on the use of local communications to exchange sensing information with neighboring devices. As proximity, location and environmental information can be obtained from nearby smartphones, we design a protocol for synchronizing the exchanges and fairly distributing the sensing tasks. We show both theoretically and experimentally the reduction in energy needed when the devices can collaborate. The second approach focuses on the way to schedule mobile sensors, optimizing for both accuracy and energy needs. We formulate the optimal sensing problem as a decision problem and propose a two-tier framework for approximating its solution. The first tier is responsible for segmenting the sensor measurement time series by fitting various models. The second tier takes care of estimating the optimal sampling, selecting the measurements that contribute the most to the model accuracy. We provide near-optimal heuristics for both tiers and evaluate their performance using environmental sensor data. In the third approach we propose an online algorithm that identifies repeated patterns in time series and produces a compressed symbolic stream. The first symbolic transformation is based on clustering of the raw sensor data.
Subsequent iterations then encode repetitive sequences of symbols into new symbols. We also define a metric to evaluate symbolization methods by how well they preserve the system's states, and we show that the symbol stream can be used directly for various data mining tasks, such as classification or forecasting, with little impact on accuracy but a large reduction in complexity and running time. In addition, we present an example application, assessing the user's exposure to air pollutants, which demonstrates the many opportunities to enrich contextual information when fusing sensor data from different sources. On one side, we gather fine-grained air quality information from mobile sensor deployments and aggregate it with an interpolation model; on the other, we continuously capture the user's context, including location, activity and surrounding air quality. Finally, we present the models used to fuse all this information and produce the exposure estimate.
... The clinical feature extraction and information fusion processes, necessary to attain clinical diagnosis, are computationally intensive tasks and thus are typically carried out in the main-frame computational facilities for prolonging the battery life of the body-worn sensors. However, a long-term sustainable operation of this system is severely affected by the significant energy expenditure required by the radio front-end for supporting continuous data transmission [2]. ...
Conference Paper
This paper presents a wavelet-based low-complexity Electrocardiogram (ECG) compression algorithm for mobile healthcare systems, against the backdrop of real clinical requirements. The proposed method aims at achieving a good trade-off between the compression ratio (CR) and the fidelity of the reconstructed signal, so as to preserve the clinically diagnostic features. Keeping the computational complexity at a minimal level is paramount, since the application area we consider is that of remote cardiovascular monitoring, where continuous sensing and processing take place in low-power, computationally constrained devices. The proposed compression methodology is based on the Discrete Wavelet Transform (DWT). The energy packing efficiency of the DWT coefficients at different resolution levels is analysed, and a thresholding policy is applied to select only those coefficients which contribute significantly to the total energy of the original signal. The proposed methodology is evaluated on normal and abnormal ECG signals extracted from the MIT-BIH database and achieves an average compression ratio of 16.5:1, an average percent root-mean-square difference of 0.75 and an average cross-correlation value of 0.98.
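The energy-packing idea described above, keeping only the DWT coefficients that carry most of the signal's energy, can be sketched as follows. This is an illustrative reconstruction, not the authors' exact pipeline: the plain Haar transform, the 99% energy target, and the synthetic test signal are all assumptions.

```python
import numpy as np

def haar_dwt(x, levels=4):
    """Multi-level Haar DWT; returns detail coefficients per level plus the
    final approximation band."""
    coeffs = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a2 = a[: len(a) // 2 * 2].reshape(-1, 2)
        coeffs.append((a2[:, 0] - a2[:, 1]) / np.sqrt(2))  # detail band
        a = (a2[:, 0] + a2[:, 1]) / np.sqrt(2)             # approximation band
    coeffs.append(a)
    return coeffs

def energy_packing_keep(coeffs, energy_fraction=0.99):
    """Count how many largest-magnitude coefficients are needed to retain the
    target fraction of total energy (the thresholding policy sketched above)."""
    flat = np.concatenate(coeffs)
    sorted_sq = np.sort(flat ** 2)[::-1]       # largest energies first
    cum = np.cumsum(sorted_sq)
    k = int(np.searchsorted(cum, energy_fraction * cum[-1])) + 1
    return k, len(flat)

# Synthetic "ECG-like" signal: a smooth baseline with one sharp spike.
t = np.linspace(0, 1, 512)
sig = np.sin(2 * np.pi * t) + np.exp(-((t - 0.5) ** 2) / 1e-4)

kept, total = energy_packing_keep(haar_dwt(sig), energy_fraction=0.99)
print(f"kept {kept}/{total} coefficients, CR ~ {total / kept:.1f}:1")
```

Because the Haar transform concentrates the smooth baseline into a few approximation coefficients, only a small fraction of the 512 coefficients is needed to hit the energy target, which is where the compression comes from.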
... Research shows that one protocol can outperform others in terms of packet size and data rate. 21 Again, since IoT devices are battery operated, varying the data rate also changes the battery consumption. In view of the fact that energy conservation is one of the important features, 22 recharging or replacing the battery at a frequent rate is not advisable. ...
Article
With the proliferation of Internet of Things (IoT) and edge computing paradigms, billions of IoT devices are being networked to support data‐driven and real‐time decision making across numerous application domains, including smart homes, smart transport, and smart buildings. These ubiquitously distributed IoT devices send the raw data to their respective edge device (eg, IoT gateways) or the cloud directly. The wide spectrum of possible application use cases makes the design and networking of IoT and edge computing layers a very tedious process due to the: (i) complexity and heterogeneity of end‐point networks (eg, Wi‐Fi, 4G, and Bluetooth); (ii) heterogeneity of edge and IoT hardware resources and software stack; (iii) mobility of IoT devices; and (iv) complex interplay between the IoT and edge layers. Unlike cloud computing, where researchers and developers seeking to test capacity planning, resource selection, network configuration, computation placement, and security management strategies had access to public cloud infrastructure (eg, Amazon and Azure), establishing an IoT and edge computing testbed that offers a high degree of verisimilitude is not only complex, costly, and resource‐intensive but also time‐intensive. Moreover, testing in real IoT and edge computing environments is not feasible due to the high cost and diverse domain knowledge required in order to reason about their diversity, scalability, and usability. To support performance testing and validation of IoT and edge computing configurations and algorithms at scale, simulation frameworks should be developed. Hence, this article proposes a novel simulator IoTSim‐Edge, which captures the behavior of heterogeneous IoT and edge computing infrastructure and allows users to test their infrastructure and framework in an easy and configurable manner. IoTSim‐Edge extends the capability of CloudSim to incorporate the different features of edge and IoT devices.
The effectiveness of IoTSim‐Edge is described using three test cases. Results show the varying capability of IoTSim‐Edge in terms of application composition, battery‐oriented modeling, heterogeneous protocols modeling, and mobility modeling along with the resources provisioning for IoT applications.
... Therefore, it is still essential to transmit large numbers of raw datasets through a WSN at high transmission speed to implement remote, real-time machine condition monitoring. The relationship between power consumption and transmission rate has been studied, revealing that long-term data transmission at a low rate can consume more power than short-term transmission at a high rate when the effect of transmission distance is excluded [40,41]. Therefore, a wireless network with a higher transmission rate (for example BLE, Wi-Fi and ZigBee) is critical to achieving remote machine condition monitoring in real time. ...
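The low-rate-versus-high-rate observation above reduces to airtime arithmetic: for a fixed payload, energy is roughly active power multiplied by time on air. A minimal sketch, where the power and rate figures are illustrative assumptions, not measurements from the cited works:

```python
# Hedged sketch of the rate-vs-energy trade-off: a faster radio keeps the
# transceiver on for less time, so even at higher active power it can spend
# less total energy on the same payload. All figures below are assumptions.

def tx_energy_mj(payload_bytes: int, rate_kbps: float, active_power_mw: float) -> float:
    airtime_s = payload_bytes * 8 / (rate_kbps * 1000)  # time the radio is on
    return active_power_mw * airtime_s

payload = 10_000  # bytes of raw sensor data

slow_radio = tx_energy_mj(payload, rate_kbps=20, active_power_mw=30)     # low-rate link
fast_radio = tx_energy_mj(payload, rate_kbps=1000, active_power_mw=100)  # BLE/Wi-Fi class

print(f"slow: {slow_radio:.1f} mJ, fast: {fast_radio:.1f} mJ")
```

Even though the fast radio draws more than three times the power here, its far shorter airtime makes the total energy per payload much lower, matching the finding in [40,41].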
Article
Full-text available
Condition monitoring can reduce machine breakdown losses, increase productivity and operation safety, and therefore deliver significant benefits to many industries. The emergence of wireless sensor networks (WSNs) with smart processing ability plays an ever-growing role in online condition monitoring of machines. WSNs are cost-effective networking systems for machine condition monitoring: they avoid cable usage and ease system deployment in industry, which leads to significant savings. Powering the nodes is one of the major challenges for a true WSN system, especially when positioned at inaccessible or dangerous locations and in harsh environments. Promising energy harvesting technologies have attracted the attention of engineers because they convert microwatt- or milliwatt-level power from the environment to implement maintenance-free machine condition monitoring systems with WSNs. The motivation of this review is to investigate the energy sources, stimulate the application of energy-harvesting-based WSNs, and evaluate the improvement of energy harvesting systems for mechanical condition monitoring. This paper overviews the principles of a number of energy harvesting technologies applicable to industrial machines by investigating the power consumption of WSNs and the potential energy sources in mechanical systems. Many models and prototypes with different features are reviewed, especially in the mechanical field. Energy harvesting technologies are evaluated for further development according to a comparison of their advantages and disadvantages. Finally, the challenges and potential future research directions of energy harvesting systems powering WSNs for machine condition monitoring are discussed.
... Take as an example the ECG signal, the fundamental component of a remote CVD monitoring system, captured at a 1-kHz sampling rate with 16-bit quantization. Considering a typical Bluetooth V2 transceiver with 40-55 mA current consumption in transmission mode and a battery capacity of 1200 mAh (typical of the batteries used for WSN applications), and following the analysis presented in [2], we conclude that continuous data transmission can be supported for only about 24 h. In addition, A/D conversion, quantization, and signal preprocessing steps are also carried out at the sensor node; including those factors, it can be argued that the operation of a continuous-transmission-based system may not be realistically sustainable for more than 8-12 h. ...
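The roughly 24-hour figure above follows from simple battery arithmetic. A quick check, using only the quoted transmit currents and battery capacity and ignoring all other loads:

```python
# Back-of-the-envelope check of the battery-life claim: a Bluetooth V2
# transceiver drawing 40-55 mA in transmit mode from a 1200 mAh battery.
# Preprocessing and A/D conversion loads are deliberately ignored here.

def battery_life_h(capacity_mah: float, current_ma: float) -> float:
    return capacity_mah / current_ma

for i_tx in (40, 55):
    print(f"{i_tx} mA -> {battery_life_h(1200, i_tx):.1f} h")
```

The two endpoints bracket roughly 22-30 h of continuous transmission, i.e. about a day, and adding the sensor-side processing loads pushes the realistic figure down toward the 8-12 h range argued above.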
Article
Full-text available
This paper introduces a low-complexity algorithm for the extraction of the fiducial points from the Electrocardiogram (ECG). The application area we consider is that of remote cardiovascular monitoring, where continuous sensing and processing take place in low-power, computationally constrained devices, thus the power consumption and complexity of the processing algorithms should remain at a minimum level. Under this context, we choose to employ the Discrete Wavelet Transform (DWT), with the Haar function as the mother wavelet, as our principal analysis method. From the modulus-maxima analysis of the DWT coefficients, an approximation of the ECG fiducial points is extracted. These initial findings are complemented with a refinement stage, based on the time-domain morphological properties of the ECG, which alleviates the decreased temporal resolution of the DWT. The resulting algorithm is a hybrid scheme of time- and frequency-domain signal processing. Feature extraction results from 27 ECG signals from QTDB were tested against manual annotations and used to compare our approach against the state-of-the-art ECG delineators. In addition, 450 signals from the 15-lead PTBDB are used to evaluate the obtained performance against the CSE tolerance limits. Our findings indicate that all but one of the CSE limits are satisfied. This level of performance, combined with a complexity analysis where the upper bound of the proposed algorithm, in terms of arithmetic operations, is calculated as 2.423N + 214 additions and 1.093N + 12 multiplications for N ≤ 861, or 2.553N + 102 additions and 1.093N + 10 multiplications for N > 861 (N being the number of input samples), reveals that the proposed method achieves an ideal trade-off between computational complexity and performance, a key requirement in remote CVD monitoring systems.
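The piecewise operation counts quoted above can be evaluated directly. The coefficients below are the authors' stated bounds; the sample input lengths are arbitrary choices for illustration:

```python
# Evaluate the authors' piecewise upper bounds on arithmetic operations for
# the fiducial-point extraction algorithm, for a given input length N.

def op_counts(n: int):
    if n <= 861:
        adds = 2.423 * n + 214
        muls = 1.093 * n + 12
    else:
        adds = 2.553 * n + 102
        muls = 1.093 * n + 10
    return adds, muls

for n in (500, 1000):
    adds, muls = op_counts(n)
    print(f"N={n}: ~{adds:.0f} additions, ~{muls:.0f} multiplications")
```

Both branches are linear in N with small constants, which is what makes the bound compatible with always-on processing on a sensor node.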
... Further, the average power consumed by the network interface was found to be around 350 mW when using 3G, around 300 mW with Wi-Fi and less than 100 mW with Bluetooth. A separate study of energy consumption in the network interface under different traffic scenarios can be found in [11,14]. This shows the amount of energy that could potentially be saved in the network interface. ...
Article
Full-text available
Over the last couple of years, there has been an exponential increase in the number of applications accessible from various mobile handsets, including Facebook, Twitter, YouTube, etc. In particular, rich media service distribution among smart-phones and other handheld devices is becoming increasingly popular among users. In fact, the next generation wireless technologies have put significant emphasis on supporting distribution of rich media content and video-on-demand services. However, energy consumption in the handheld wireless devices is a major bottleneck that hinders the growth of mobile device based rich media services. The biggest problem today in the mobile world is that the mobile devices are battery driven and the battery technologies are not matching the required energy demand. This paper outlines the major energy-consuming components in handheld devices like smart-phones, PDAs and other multimedia-centric wireless devices. Further, this paper surveys different research works on how the energy consumption could be optimized and provides detailed discussions on the latest energy saving techniques in the major components of the mobile devices. In addition, the paper surveys other systemic energy optimization techniques so that the overall battery life of the device is increased. Major global research projects and their research focus are then surveyed. Finally, a brief summary is provided along with some open research problems and different possible future research directions.
... Another approach to analyzing the energy consumption of mobile devices is from the applications perspective. In this context, some works [13,14] have measured the consumption associated with the use of applications such as Bluetooth and the SMS service. The experiment was repeated 15 times for each state in a closed environment, using Bluetooth 2.0 technology. ...
... Another approach to analyzing the energy consumption of mobile devices is from the applications perspective. In this context, some works [9] have measured the consumption associated with the use of applications such as Bluetooth, SMS and the email service. Table I shows the consumption of a mobile device in different states of Bluetooth use. ...
Conference Paper
Full-text available
Recent advances in computing and communication technology have led to increased use of mobile devices. A trend in this area is the integration of different applications into a single general-purpose device, often resulting in much higher energy consumption and consequently much reduced battery life. This higher consumption creates problems for the evolution of mobile devices, since developers cannot exploit the full potential of the current technology. Furthermore, higher consumption also runs counter to the current trend of green computing, which tries to create more energy-saving and eco-friendly computational systems. This paper discusses our efforts in creating techniques to optimize the energy consumption of mobile devices and in creating a certification program that can verify whether a mobile device can be considered "green". Results and details about our literature review, laboratory setup, experiments, list of good practices and more are given throughout this paper.
... Research shows that one protocol can outperform others in terms of packet size and data rate. 21 Again, since IoT devices are battery operated, varying the data rate also changes the battery consumption. In view of the fact that energy conservation is one of the important features, 22 recharging or replacing the battery at a frequent rate is not advisable. ...
Preprint
This paper proposes a novel simulator IoTSim-Edge, which captures the behavior of heterogeneous IoT and edge computing infrastructure and allows users to test their infrastructure and framework in an easy and configurable manner. IoTSim-Edge extends the capability of CloudSim to incorporate the different features of edge and IoT devices. The effectiveness of IoTSim-Edge is described using three test cases. The results show the varying capability of IoTSim-Edge in terms of application composition, battery-oriented modeling, heterogeneous protocols modeling and mobility modeling along with the resources provisioning for IoT applications.
Article
Making new connections according to personal preferences is a crucial service in mobile social networking, where an initiating user can find matching users within physical proximity of him/her. In existing systems for such services, usually all the users directly publish their complete profiles for others to search. However, in many applications, the users' personal profiles may contain sensitive information that they do not want to make public. In this paper, we propose FindU, a set of privacy-preserving profile matching schemes for proximity-based mobile social networks. In FindU, an initiating user can find from a group of users the one whose profile best matches with his/her; to limit the risk of privacy exposure, only necessary and minimal information about the private attributes of the participating users is exchanged. Two increasing levels of user privacy are defined, with decreasing amounts of revealed profile information. Leveraging secure multi-party computation (SMC) techniques, we propose novel protocols that realize each of the user privacy levels, which can also be personalized by the users. We provide formal security proofs and performance evaluation on our schemes, and show their advantages in both security and efficiency over state-of-the-art schemes.
Article
This paper presents a comprehensive overview on the area of energy-aware common content distribution over wireless networks with mobile-to-mobile cooperation. It is assumed that a number of mobile terminals (MTs) that are geographically close to each other are interested in downloading the same content from a server via a base station using a long-range wireless technology. Selected MTs download the content directly from the base station and transmit it to other MTs using a short-range wireless technology. This cooperation can lead to significant performance gains since short-range wireless technologies are energy efficient and provide higher data rates due to the geographical proximity among the MTs. In this paper, we highlight the main alternatives that shape the design of cooperative content distribution architectures with focus on energy efficiency. These include content segmentation, long-range and short-range distribution strategies, grouping of the MTs into cooperating clusters, single hop and multihop communications among the MTs, resource allocation, fairness considerations, and network dynamics. We also discuss various methods commonly utilized for developing content distribution algorithms and evaluating network performance. Finally, we present sample results for selected network scenarios, discuss related standardization activities, and highlight future research directions.
Article
Today's home networks have become a heterogeneous environment in terms of the technologies used and do not incorporate any intelligent energy-saving mechanism. In this paper we propose an energy-saving strategy that integrates a Wireless Sensor Network (WSN) with a high-speed, hybrid home network. The main objective is to demonstrate that a WSN can act as a dependable control plane, in which low-data-rate control packets can flow even when the high-speed home network is shut down. While the home network nodes, or some of each node's network technologies, can be deactivated, the WSN is always on and, owing to the low data rate required for proper operation, it consumes a very limited amount of energy. This mutual interaction between the high-speed home network and the WSN makes it possible both to achieve the convergence of heterogeneous communication technologies in the home environment and to substantially reduce energy consumption. Simulation results show that this strategy is effective in a multitude of scenarios and provides a tangible economic benefit.
Conference Paper
The paper presents a comparison between various approaches to the optimized operation of a Content Centric Residential Community Network (CCRCN). The evaluation focuses mainly on the energy footprint of CCRCNs, but other factors, such as content offloading and Internet sharing, are also taken into account. The analysis shows that the current approaches to operating a CCRCN are not very profitable in terms of energy: some produce very marginal savings, whereas the behaviour of others depends strongly on the operational conditions and is thus scarcely predictable. We point out the strengths and weaknesses of each approach, paving the way for future greening strategies for CCRCNs.
Conference Paper
The set of services provided by the mobile phone platform is becoming increasingly complex and demands more computational power, hence higher energy consumption, compromising the autonomy of these devices. It is thus important to identify scenarios where methods can be applied to decrease this consumption and extend device autonomy. This work presents practical results of an experimental study that evaluates four scenarios. For each scenario, a control threshold was defined and key variables were then modified so that we could analyze their impact on energy consumption. From this analysis we identified opportunities to develop new energy-saving methods, such as the examples listed in this paper.
Conference Paper
In this paper first we present an overview of the state-of-the-art remote patient monitoring systems in the backdrop of real clinical needs. The paper establishes a clear guideline in terms of clinical expectations from such a system from the viewpoint of practicing clinicians. It provides in-depth analysis of the shortcomings of the existing architectures and paves a way towards developing a practical “patient-centric” architecture that could be useful in the day-to-day clinical practice for providing “continuum of care”. Subsequently, the restrictions imposed by the resource constrained nature of such a system on development of appropriate hardware for supporting information processing on the data acquired by body-worn sensors are analyzed.
Article
Mobile devices such as personal digital assistants (PDAs) and smartphones are widely used not only in our everyday lives but also in various industrial fields. Most of these mobile devices have multiple wireless network interfaces, such as Bluetooth, 3G, and Wi-Fi. A considerable amount of energy is consumed to transfer data through wireless communication. Moreover, most of these mobile devices operate on limited battery power. In industrial environments, changes in the communication environment are severe due to significant noise sources and distortion in the transceiver circuitry caused by strong motors, static frequency changers, electrical discharge devices, and other equipment. It is necessary to select the network interface efficiently in order to extend the lifetimes of mobile devices and their applications. Therefore, in this paper, we propose an energy-efficient adaptive wireless network interface-selection scheme (AWNIS). Our scheme is based on mathematical modeling of energy consumption and data transfer delay patterns. It selects the best wireless network interface in terms of energy consumption by considering the link quality and adapting a dynamic network interface-selection interval according to the network environment. The simulation results show that the proposed scheme effectively improves energy efficiency while guaranteeing a certain level of data transfer delay.
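The selection logic of an AWNIS-style scheme can be sketched as picking the interface with the lowest estimated energy per transfer, with effective throughput scaled by link quality. This is a simplified illustration only: the scoring rule and every numeric figure below are assumptions, not the paper's actual model.

```python
# Hedged sketch of energy-aware interface selection: estimate energy per
# transfer for each interface (power x airtime, with throughput degraded by
# link quality) and pick the cheapest. All parameters are illustrative.

def energy_per_transfer_mj(payload_bytes, rate_mbps, power_mw, link_quality):
    effective_rate = rate_mbps * 1e6 * link_quality  # bits/s after losses
    airtime_s = payload_bytes * 8 / effective_rate
    return power_mw * airtime_s

interfaces = {
    "bluetooth": dict(rate_mbps=2, power_mw=100, link_quality=0.9),
    "wifi":      dict(rate_mbps=54, power_mw=300, link_quality=0.7),
    "3g":        dict(rate_mbps=2, power_mw=350, link_quality=0.95),
}

payload = 1_000_000  # a 1 MB transfer
best = min(interfaces, key=lambda k: energy_per_transfer_mj(payload, **interfaces[k]))
print("selected interface:", best)
```

For a bulk transfer like this, the much shorter airtime of the high-rate interface outweighs its higher active power; in a real scheme the link-quality terms would be re-estimated periodically, which is what the paper's dynamic selection interval addresses.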
Article
This article analyzes vehicle communication through the OBDII diagnostic interface using diagnostic tools common in automotive diagnostics, with an emphasis on Bluetooth communication. Based on this analysis, the paper describes the design of software and its implementation on mobile devices for the purpose of communicating with the CAN OBDII interface. The diagnostics-oriented communication processor ELM 327 is typically used for this type of communication. The aim of this solution is to deliver an application for mobile devices running the Android operating system that can obtain real-time data from the engine control unit through the ELM 327.
Article
The increasing demand for large data downloads on cellular networks is increasing congestion which decreases end user quality of service. This work addresses the problem of offloading the cellular network while distributing common content to a group of mobile devices that cooperate during the download process by forming device-to-device communication networks. The base station unicasts different chunks of the content to selected mobile devices that multicast it to each other over local ad hoc networks using multihop cooperation while maintaining fairness constraints on the energy consumption of the mobile devices. The optimal cellular offloading problem is formulated as a mixed integer linear programming problem and the corresponding complexity is analyzed. Then, a dynamic programming approach is proposed to adapt the solution to the dynamics of the network as the mobile devices move. Cellular offloading assuming single hop cooperation among the mobile devices proves to be significantly less computationally complex than cooperation using a higher number of hops; however both problems are NP-complete. Thus, polynomial time greedy algorithms are presented to obtain computationally fast solutions with good performance. Performance results demonstrate that significant cellular offloading gains can be achieved, even if only a very small fraction of the mobile devices' battery levels can be consumed for cooperation.
Chapter
Full-text available
The availability of effective communications in post-disaster scenarios is key to implementing emergency networks that enable the sharing of critical information and support the coordination of the emergency response. To deliver the levels of QoS these applications require, it is vital to exploit the multiple communication opportunities made available by the progressive deployment of the 5G and Smart City paradigms, ranging from ad-hoc networks among smartphones and surviving IoT devices, to cellular networks, but also drone-based and vehicle-based wireless access networks. The user device should therefore be able to opportunistically select the most convenient among them to satisfy the QoS demands imposed by the applications while minimizing power consumption. The driving idea of this paper is to leverage non-cooperative game theory to design such an opportunistic user association strategy in a post-disaster scenario using UAV ad-hoc networks. The adaptive game-theoretic scheme increases the QoS of the communication means by lowering the loss rate, while keeping energy consumption moderate.
Article
The use of mobile devices has increased manyfold over the last few years. Smart phones are used not only for communication but also for storing personal content such as photos, videos, documents, and bank account and credit/debit card details. Secure access to this content is very important. Nowadays, many mobile devices come with inbuilt biometric security features such as fingerprint, face recognition, and iris scanning. However, additional battery power is consumed each time a user unlocks the device using any of these security features. To enable prolonged use of the device, there is a strong need to find ways to conserve power in mobile devices. At the same time, it is equally important that the smartphone user knows how long the battery of the device will last. In this paper, we present a novel power optimization and battery lifetime prediction framework called P4O (Pattern, Profiling, Prediction, and Power Optimization). Our contributions are threefold: (i) propose a novel framework for power optimization in smartphones; (ii) propose a new approach for battery lifetime forecasting; (iii) implement and validate the efficacy of the proposed framework. For the experimental results, the proposed framework was implemented on an Android-based smartphone. The experimental results validate the proposed framework, with power optimization of up to 40% over the default Linux and Android power-saving features available in the Android operating system. The framework is also able to forecast battery lifetime with an accuracy of up to 98%.
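A battery-lifetime forecast of the kind P4O performs can be illustrated, in heavily simplified form, by linearly extrapolating the recent discharge slope. The sample readings and the linear model below are assumptions for illustration only; the paper's actual predictor is pattern- and profile-based.

```python
# Toy battery-lifetime forecast: fit a straight line (least squares) to the
# recent battery-level samples and extrapolate to 0%. This only conveys the
# idea of lifetime prediction, not the P4O framework's actual method.

def forecast_remaining_h(times_h, levels_pct):
    n = len(times_h)
    t_mean = sum(times_h) / n
    l_mean = sum(levels_pct) / n
    slope = sum((t - t_mean) * (l - l_mean) for t, l in zip(times_h, levels_pct)) \
            / sum((t - t_mean) ** 2 for t in times_h)  # %/h, expected negative
    return -levels_pct[-1] / slope if slope < 0 else float("inf")

# Hypothetical samples: battery drops from 80% to 65% over 3 hours.
t = [0.0, 1.0, 2.0, 3.0]
lvl = [80.0, 75.0, 70.0, 65.0]
print(f"~{forecast_remaining_h(t, lvl):.0f} h until empty")
```

At a steady 5%/h discharge from 65%, the extrapolation gives about 13 hours; a profiling-based predictor improves on this by recognizing usage patterns rather than assuming a constant slope.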
Article
A method for electrocardiogram (ECG) feature extraction is presented for automatic classification of heartbeats, using values of RR intervals, amplitude and Hjorth parameters. Hjorth parameters have been used in a variety of research areas, but their application to ECG signal processing is still little explored. This paper also introduces a new approach to heartbeat segmentation, which avoids mixing information from adjacent beats and improves classification performance. The proposed model is validated in the Massachusetts Institute of Technology - Beth Israel Hospital (MIT-BIH) Arrhythmia database and presents an overall accuracy of 90.4%, better than other state-of-the-art methods. There is an improvement over other models in positive predictivity for class S (66.6%) of supraventricular ectopic beats, and sensitivity for class N (93.0%). Results obtained indicate that the techniques used in this study can be successfully applied to the problem of automatic heartbeat classification. In addition, this new approach has low computational cost, which allows its later implementation in hardware devices with limited resources.
Article
Diabetic retinopathy is a leading cause of blindness. This paper introduces an automatic method to detect and remove the blood vessels. Blood vessel detection is the fundamental step in diagnosing diabetic retinopathy, because the blood vessels are the characteristic features of the retinal image; detecting them can help ophthalmologists recognize diseases earlier and faster. The blood vessels are detected and removed using a Gabor filter on two publicly available retinal databases, STARE and DRIVE. The accuracy of the segmentation algorithm is evaluated quantitatively by comparing the manually segmented images with the corresponding output images; the Gabor filter combined with entropic thresholding for vessel pixel segmentation yields better vessel detection with a lower false positive rate.
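As an illustration of the Gabor-filter approach, a NumPy sketch of the real-valued 2-D Gabor kernel and an orientation bank; all parameter values here are illustrative guesses, not those used in the paper:

```python
import numpy as np

def gabor_kernel(ksize=15, sigma=3.0, theta=0.0, lam=8.0, gamma=0.5):
    """Real part of a 2-D Gabor kernel: a Gaussian envelope modulated
    by a cosine carrier of wavelength `lam`, rotated by `theta`."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / lam)

# A bank of 12 orientations: an elongated structure such as a retinal
# vessel responds most strongly when the kernel orientation matches
# the local vessel direction, so the per-pixel maximum over the bank
# enhances vessels.
bank = [gabor_kernel(theta=np.pi * k / 12) for k in range(12)]
```

In a full pipeline the maximum filter response over the bank would then be binarized, e.g. with the entropic threshold the abstract mentions.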
Conference Paper
We demonstrate a prototype that takes advantage of open-source software to put a full-text searchable copy of Wikipedia on a Raspberry Pi, providing nearby devices access to content via WiFi or Bluetooth without requiring internet connectivity. This short paper articulates the advantages of such a form factor and provides an evaluation of browsing and search capabilities. We believe that personal digital libraries on lightweight mobile computing devices represent an interesting research direction to pursue.
Article
In this paper we focus on a Residential Community Network (RCN) in which the various home gateways have (limited) caching capabilities, can be shared among the users, and can exploit multi-hop wireless paths to communicate with each other. We analyse such a RCN from an energy-aware perspective, investigating its energy-saving potential in delivering the contents to the end users. To this aim, we define a resource allocation and routing scheme and, by means of an integer linear programming model, we solve to optimality the problem of associating the user terminals to the home gateways and computing the data flow paths. Accordingly, the resulting allocation pattern can serve as an upper bound for the global “greening” capability of the RCN. A comparison between our method and other techniques for resource sharing in RCNs sheds light on the pros and cons of both our scheme and the other approaches.
Article
A Distributed Virtual Environment (DVE) system provides a shared virtual environment where physically separated users can interact and collaborate over a computer network. There are three major challenges to improve DVE scalability: effective DVE system performance measurement, understanding the controlling factors of system performance/quality and determining the consequences of DVE system changes. We describe a DVE Scalability Engineering (DSE) process that addresses these three major challenges for DVE design. The DSE process allows us to identify, evaluate, and leverage trade-offs among DVE resources, the DVE software, and the virtual environment. We integrate our load simulation and modeling method into a single process to explore the effects of changes in DVE resources.
Conference Paper
Home networks have become heterogeneous environments hosting a variety of wireless and wired telecommunication technologies. Currently, no intelligent energy-saving mechanism exists to control home networks. In this paper we propose an energy-aware strategy that integrates a Wireless Sensor Network (WSN) with a convergent digital home network. We aim to demonstrate that a WSN can act as a dependable control plane to manage the high-speed home network. While the home network nodes can be deactivated, the WSN is always on and, thanks to the low data rate it requires, consumes very little energy. This mutual interaction leads to a substantial reduction in energy consumption. Simulation results show that this strategy is effective in different scenarios and provides a tangible economic benefit.
Article
A large number of new data-consuming applications are emerging in the daily routines of mobile users. Device-to-Device (D2D) communication is introduced as a new paradigm to reduce the increasing traffic and offload it to the user equipment (UE). With the development of UE multi-radio interfaces, we first develop a new hybrid architecture for D2D communication. The architecture combines the ISM 2.4 GHz spectrum as the Out-Band mode, using Bluetooth and WiFi-Direct, with the cellular spectrum as the In-Band mode. Secondly, we design a scheme that forms the Out-Band cluster and performs the subsequent periodic signaling interaction via the Bluetooth interface. Traffic is transferred via the WiFi-Direct interface inside the cluster but carried on the cellular spectrum among clusters. Simulation results show that our proposal increases system throughput, saves power, and prolongs cluster lifetime.
Article
This paper addresses the problem of energy-aware multihop cooperation among mobile terminals (MTs) that cooperatively download a common content from a wireless network. The base station (BS) unicasts or multicasts the content to selected MTs that, in turn, either unicast or multicast it to other MTs, forming a multihop ad hoc network with a predefined maximum allowed number of hops. First, this paper presents the optimization formulations whose solution gives the exact optimal set of receiving MTs from the BS, the optimal multihop ad hoc network, and the optimal unicasting and multicasting transmission bit rates that minimize the total energy consumption of the MTs. Second, a simplified multicasting formulation is proposed that has close-to-optimal performance with notably lower computational complexity. Third, interference avoidance among the transmitting MTs is considered. For each presented formulation, the complexity is identified, and results show that some formulations can be efficiently solved for medium network sizes, while others are more computationally complex. Thus, polynomial-time heuristic solutions are presented that have close-to-optimal performance. Results demonstrate remarkable energy consumption reduction gains and wireless resource savings under various network scenarios.
In military action, marching is a common method used for supply-troop movement. Supply routes are typically in the wilderness, where route conditions change over time. This paper proposes a power-saving algorithm allowing supply troops to collect route information using wireless sensor network technology. Each member in the marching supply troop is equipped with a battery-powered sensor. To reduce power consumption, the proposed methods schedule the sleeping period of each member according to the size of the marching supply troop and its moving velocity. Two data-carrying methods are proposed to reduce the frequency of long-distance data uploading: the first carries the uploaded data within a single-round data collection period, and the second extends the data-carrying period to multiple rounds. The simulation results show that scheduling a sleep period can prolong the sensing distance along the route; the two proposed methods extend the covered distance by an additional 18–70% over methods without sleep scheduling, and the energy spent on long-distance data transmissions is reduced by 7–25%. Copyright © 2015 John Wiley & Sons, Ltd.
Article
In the last decade, one of the main goals in wireless telecommunications has been to reduce the energy consumption of mobile devices. However, making a network device green can cause performance deterioration. The target of this paper is to propose a cross-layer approach to the design of a mobile video cloud for uplink transmission towards the Internet. The proposed approach is adaptive in both the video sources and the wireless transmitter. A source Rate Controller is applied to compensate for the transmission bandwidth reduction caused by the energy-saving policies. Energy saving in wireless transmission on the mobile cloud cellular channel is achieved by introducing an energy-efficient ARQ protocol. This protocol can apply different transmission laws in order to exploit the correlation of the cellular channel behavior. An analytical model of the system is defined to compare the transmission laws and provide some design guidelines for choosing one of them and designing its parameters.
Conference Paper
Multi-Sensory mobile health monitoring systems promise substantial improvements in the quality of healthcare. However, large-scale trials are uncovering key areas that inhibit long-term large-scale deployments, including power consumption and lifetime issues, and high communication overhead. Traditional techniques can efficiently resolve these issues while maintaining semantic fidelity of the sensed medical signal, but also amplify the signal's sensitivity to sensor faults, thereby reducing system safety. We propose a set of statistical techniques to optimize system power and bandwidth consumption, while adhering to signal fidelity and sensor fault diagnosis requirements. By defining signal fidelity in terms of its semantic value, and formulating the problem as a sensor subset selection wherein mutual information rather than aggregate signal quality is maximized, we show that power consumption in a wireless human gait monitoring system can be reduced by up to 78% while accurately estimating many functional gait assessment metrics and precisely diagnosing semantic faults.
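The sensor-subset-selection idea above, maximizing mutual information with the quantity of interest rather than aggregate signal quality, can be sketched with a greedy forward search; this minimal version assumes jointly Gaussian data (so mutual information reduces to log-determinants of covariances) and is not the paper's formulation:

```python
import numpy as np

def gaussian_mi(X, y):
    """I(X; y) in nats under a joint-Gaussian assumption:
    0.5 * (log det Cov(X) + log Var(y) - log det Cov([X, y]))."""
    Z = np.column_stack([X, y])
    C = np.cov(Z, rowvar=False)
    k = X.shape[1]
    _, logdet_all = np.linalg.slogdet(C)
    _, logdet_x = np.linalg.slogdet(C[:k, :k])
    return 0.5 * (logdet_x + np.log(C[k, k]) - logdet_all)

def greedy_select(X, y, budget):
    """Forward selection: repeatedly add the sensor column that most
    increases I(selected; y). Illustrative sketch only."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(budget):
        best = max(remaining,
                   key=lambda j: gaussian_mi(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected

# Synthetic check: sensor 0 drives the target, sensors 1-2 are noise,
# so the greedy search should pick sensor 0 first.
rng = np.random.default_rng(0)
s0 = rng.normal(size=2000)
X = np.column_stack([s0, rng.normal(size=2000), rng.normal(size=2000)])
y = s0 + 0.1 * rng.normal(size=2000)
sel = greedy_select(X, y, 2)
```

Greedy selection is a common heuristic here because exact subset selection is combinatorial; under submodularity assumptions it carries approximation guarantees.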
Conference Paper
Ad-hoc networking is one of the most influential architectures in the development of the Future Internet, where nodes are expected to be wireless and mobile. Unlike the traditional IP-based and simulation-based approaches in the ad-hoc research field, this work presents a real implementation of a proactive protocol (OLSR) and a reactive protocol (AODV) over real Bluetooth devices (a non-IP approach) in static and dynamic scenarios. Throughout the paper we point out the many obstacles that had to be solved to make Bluetooth work as the supporting technology (the need for a previously established connection, the lack of broadcast messages, the overhead...). We have carried out a set of experiments to compare the potential and limitations of both protocols. This work was developed within the Future Internet project supported by the Basque Government under the ETORTEK Programme and the FuSeN project supported by the Spanish Ministerio de Ciencia e Innovación.