Article

Energy Consumption Analysis for Bluetooth, WiFi and Cellular Networks


Abstract

This document analyzes the average energy consumption of Bluetooth, WiFi (802.11), and cellular networks for transmitting data produced at f bytes per second. It is assumed that a packet is created every t_buf seconds and sent to the respective module for transmission. Thus, the data produced by an application in t_buf seconds is d = t_buf * f bytes, neglecting packet overhead. The energy and current values used in this report are taken from data sheets or published papers, or were provided by the vendor (in the case of Bluetooth).
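The buffering model in the abstract can be sketched in a few lines. This is a minimal illustration of the relation d = t_buf * f; the linear energy model and all numeric values below are placeholder assumptions, not figures from the report.

```python
# Sketch of the buffering model from the abstract: an application produces
# f bytes/s, and every t_buf seconds the accumulated d = t_buf * f bytes
# are handed to the radio module (packet overhead neglected, as in the text).

def data_per_packet(f_bytes_per_s: float, t_buf_s: float) -> float:
    """Bytes handed to the radio each buffering period."""
    return t_buf_s * f_bytes_per_s

def energy_per_packet(d_bytes: float, e_per_byte_j: float, e_fixed_j: float) -> float:
    """Illustrative linear energy model: a fixed wake-up cost plus a
    per-byte cost. Both cost parameters are hypothetical placeholders."""
    return e_fixed_j + d_bytes * e_per_byte_j

d = data_per_packet(f_bytes_per_s=100, t_buf_s=2)  # 200 bytes per packet
e = energy_per_packet(d, e_per_byte_j=1e-6, e_fixed_j=1e-3)
print(d, e)
```

Under such a model, a longer t_buf amortizes the fixed per-transmission cost over more bytes, which is why buffering matters for average energy consumption.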


... Current applications dealing with real-world challenges generate huge amounts of data and require tremendous computing resources. The exploitation of advanced cloud services for HPC and data-intensive applications (e.g., big data applications) is a common practice supported by both academia and industry (Quwaider and Jararweh, 2013, 2014, 2015). ...
... The following are the trade-offs between these two communication technologies. A WBAN user can transmit a data packet to the cloud with lower power and lower delay using Wi-Fi than with 3G or LTE cellular communication (Balani, 2007; Zhang et al., 2013), but the Wi-Fi transmission range does not exceed 100 m (Joseph et al., 2004). This Wi-Fi capability is essential for achieving efficient power consumption in WBAN sensors while successfully transmitting data to the cloud system. ...
... The Wi-Fi technology in our implementation should be available in the cloudlet area. Via Wi-Fi, it was shown that transmitting a data packet of 46 bytes costs about 30 mW (Balani, 2007; Dementyev et al., 2013), with a packet delay of 0.045 ms. On the other hand, cellular communication technologies such as 3G or LTE have a longer transmission range, and the user can transmit the data packet to the enterprise cloud from any position covered by the cellular network. ...
Article
Full-text available
This paper presents an efficient large-scale data collection in wireless body area network (WBAN) in the presence of cloudlet-based prototype system. The key contribution of this paper is to collect the observed data of WBANs in a large-scale and convey it in consistent manner to the other end of service providers. A model of WBANs is proposed in this work including virtualised machines and cloudlet in order to characterise the efficient WBANs data collection. A scalable storage and processing infrastructure is proposed to support large-scale WBANs system, which is efficiently capable to handle the big data generated by large number of WBANs users. The proposed model supports effective cost communication technologies through Wi-Fi technology. Performance results of the proposed prototype are evaluated using advanced CloudSim simulator. The performance results show that the consumed power and packet delay of the collected data are decreased by increasing the number of virtualised machines and cloudlets in the monitored area. The results show also that the performance depends on the method of the virtualised cloudlet distribution in the target area for a given number of users.
... The differences between these two communication technologies are as follows. A WBAN user with WiFi is able to transmit data to the cloud with low delay and low power compared with cellular network technology [41], but with a transmission range limited to 100 m [42]. WiFi's low power consumption accommodates the power-constrained WBAN sensors while successfully transmitting data to the cloud service. ...
... In our prototype system, WiFi is made available by the cloudlet in the covered area. Via WiFi, it was shown that the transmission delay of a 46-byte data packet is roughly 0.045 ms [41,43], with a power cost of 30 mW. On the other hand, a cellular network connection (e.g. ...
... On the other hand, a cellular network connection (e.g., 3G, 4G, or LTE) has a longer transmission range and is capable of sending the data packet to the cloud service from any location covered by cellular technology, which is geographically wider than the WiFi coverage area [41,43]. Via cellular technology, it was shown that the transmission delay of a 46-byte data packet is roughly 0.45 ms, with a power cost of 300 mW [41,43]. ...
... The following are the trade-offs between these two communication technologies. A BAN user can transmit a data packet to the cloud with lower power and lower delay using Wi-Fi than with 3G or LTE cellular communication [19], but the Wi-Fi transmission range does not exceed 100 m [20]. This Wi-Fi capability is essential for achieving efficient power consumption in BAN sensors while successfully transmitting data to the cloud system. ...
... The Wi-Fi technology in our implementation should be available in the cloudlet area. Via Wi-Fi, it was shown that transmitting a data packet of 46 bytes costs about 30 mW [7], [19], [21], with a packet delay of 0.045 ms. On the other hand, cellular communication technologies such as 3G or LTE have a longer transmission range, and the user can transmit the data packet to the enterprise cloud from any position covered by the cellular network. ...
... Therefore, cellular communication technology usually covers a wider geographic area than Wi-Fi. However, transmitting a data packet of 46 bytes using cellular technology costs 300 mW, with a delay of 0.45 ms [19], [21]. Therefore, cellular communication is very expensive in terms of power and delay compared with Wi-Fi, which is mostly free of charge. ...
... The trade-offs between these two technologies are as follows. With WiFi, a WBAN user can transmit the data packet to the cloud with low power and low delay compared with cellular technology [26], but the transmission range does not exceed 100 m [27]. Such WiFi capability is crucial to accommodate the power constraints of WBAN sensors while successfully transmitting data to the cloud system. ...
... It was shown that, via WiFi, the transmission power and delay are 10 times lower than with cellular technology (e.g., 3G and LTE) for the same packet size [26,28,29]. While a cellular connection is very costly in terms of power, transmission delay, and connection cost (WiFi is mostly free of charge), it is very important for supporting WBAN user mobility when no cloudlet is in range and for scaling the system to large numbers of WBAN users [8,9,30], as we will discuss in Section 4. ...
... Each experiment in these results lasted 3600 s in an area of 400 × 600 m. 400 human subjects move through the area following random-waypoint mobility at a speed of 2 m/s with random pause times of 1-10 s, and each user sends a 46-byte packet to the cloud every 10 s. Recall that, via WiFi, transmitting a data packet of 46 bytes costs about 30 mW [26,28,29], with a delay of 0.045 ms. Meanwhile, via a cellular network connection (e.g. ...
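As a back-of-the-envelope check of the 46-byte packet figures quoted in these excerpts, energy per packet can be approximated as transmit power times transmit delay. Treating the quoted power/delay pairs this way is our own simplifying assumption, not a calculation from the cited papers:

```python
# Energy per 46-byte packet approximated as power x delay, using the
# figures quoted in the excerpts: WiFi 30 mW / 0.045 ms, cellular
# 300 mW / 0.45 ms.

def energy_per_packet_j(power_w: float, delay_s: float) -> float:
    """Approximate radiated/transmission energy for one packet."""
    return power_w * delay_s

wifi_j = energy_per_packet_j(0.030, 0.045e-3)   # WiFi: ~1.35 microjoules
cell_j = energy_per_packet_j(0.300, 0.45e-3)    # cellular: ~135 microjoules
print(wifi_j, cell_j, cell_j / wifi_j)
```

Under this reading, cellular costs two orders of magnitude more energy per packet than WiFi, which is consistent with the "10 times less power and 10 times less delay" claim made in the excerpts.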
... The trade-off between these two technologies is the following. With WiFi, a BAN user can transmit the data packet to the cloud via a smartphone with low power and low delay compared with cellular technology, but the transmission range does not exceed 100 m [14]. Such WiFi capability is crucial for overcoming the power constraint in mobile environments while successfully transmitting data to the cloud system. ...
... In our implementation, the WiFi technology will be available in the cloudlet area. It was shown that, via WiFi, transmitting a data packet costs 30 mW with a delay of 0.045 ms [14]. On the other hand, a longer-range cellular network connection (e.g. ...
... 3G and LTE) is capable of transmitting the data packet to the cloud from any location covered by the cellular network, which usually spans a wider geographic area than WiFi. It was shown that, via a cellular connection, transmitting a data packet costs 300 mW with a delay of 0.45 ms [14]. While the cellular connection is very costly in terms of power, transmission delay, and connection cost (WiFi is mostly free of charge), it is very important for supporting mobile users when no cloudlet is in range and for scaling the system to large numbers of users. ...
Conference Paper
Mobile cloud computing is an emerging and fast-growing computing paradigm that has gained great interest from both industry and academia. Consequently, many researchers are actively involved in cloud computing research projects. One major challenge facing mobile cloud computing researchers is the lack of a comprehensive experimental framework to use in their experiments and to evaluate their proposed work. This paper introduces a modeling and simulation environment for mobile cloud computing. The experimental framework can be used to evaluate a wide spectrum of mobile cloud components such as processing elements, storage, networking, applications, etc. The framework is built on top of the CloudExp framework which provides the major building blocks needed for any cloud system. Moreover, mobile cloud experimental framework can exploit CloudExp capabilities to simulate big data generation and processing scenarios. An experimental scenario is also introduced in this paper to demonstrate the capabilities of the proposed framework.
... The following are the trade-offs between these two communication technologies. A WBAN user can transmit a data packet to the cloud with lower power and lower delay using Wi-Fi than with 3G or LTE cellular communication [25], [26], but the Wi-Fi transmission range does not exceed 100 m [27]. This Wi-Fi capability is essential for achieving efficient power consumption in WBAN sensors while successfully transmitting data to the cloud system. ...
... The Wi-Fi technology in our implementation should be available in the cloudlet area. Via Wi-Fi, it was shown that transmitting a data packet of 46 bytes costs about 30 mW [7], [25], [28], with a packet delay of 0.045 ms. On the other hand, cellular communication technologies such as 3G or LTE have a longer transmission range, and the user can transmit the data packet to the enterprise cloud from any position covered by the cellular network. ...
... Therefore, cellular communication technology usually covers a wider geographic area than Wi-Fi. However, transmitting a data packet of 46 bytes using cellular technology costs 300 mW, with a delay of 0.45 ms [25], [28]. Therefore, cellular communication is very expensive in terms of power and delay compared with Wi-Fi, which is mostly free of charge. ...
Article
This paper presents efficient large-scale data collection in Wireless Body Area Networks (WBANs) in the presence of a cloudlet-based prototype system. The key contribution of this paper is to collect the observed data of WBANs at large scale and convey it in a consistent manner to service providers at the other end. A model of WBANs including virtualized machines and cloudlets is proposed in this work to characterize efficient WBAN data collection. A scalable storage and processing infrastructure has been proposed to support a large-scale WBAN system, capable of efficiently handling the big data generated by a large number of WBAN users. The proposed model supports cost-effective communication through Wi-Fi technology. Performance results of the proposed prototype are evaluated using an advanced CloudSim simulator. The performance results show that the consumed power and packet delay of the collected data decrease as the number of virtualized machines and cloudlets in the monitored area increases. The results also show that the performance depends on the method of virtualized cloudlet distribution in the target area for a given number of users.
... The following are the trade-offs between these two communication technologies. A BAN user can transmit a data packet to the cloud with lower power and lower delay using Wi-Fi than with 3G or LTE cellular communication [19], but the Wi-Fi transmission range does not exceed 100 m [20]. This Wi-Fi capability is essential for achieving efficient power consumption in BAN sensors while successfully transmitting data to the cloud system. ...
... The Wi-Fi technology in our implementation should be available in the cloudlet area. Via Wi-Fi, it was shown that transmitting a data packet of 46 bytes costs about 30 mW [7], [19], [21], with a packet delay of 0.045 ms. On the other hand, cellular communication technologies such as 3G or LTE have a longer transmission range, and the user can transmit the data packet to the enterprise cloud from any position covered by the cellular network. ...
... Therefore, cellular communication technology usually covers a wider geographic area than Wi-Fi. However, transmitting a data packet of 46 bytes using cellular technology costs 300 mW, with a delay of 0.45 ms [19], [21]. Therefore, cellular communication is very expensive in terms of power and delay compared with Wi-Fi, which is mostly free of charge. ...
Conference Paper
In this paper we present an efficient big data collection model in Body Area Networks (BANs) using a cloudlet-based system prototype. The novelty of the proposed work is to gather the monitored data of BANs at large scale and deliver it in a reliable manner to service providers. A prototype of BANs including virtualized machines and cloudlets is proposed in this paper to characterize efficient BAN data collection. A scalable storage and processing infrastructure has been proposed to support a large-scale BAN system, capable of efficiently handling the big data generated by BAN users. The model supports cost-effective communication through Wi-Fi technology. Performance results of the proposed prototype are evaluated using an advanced CloudSim simulator. The performance results show that the consumed power and packet delay of the collected data decrease as the number of virtualized machines and cloudlets increases.
... The trade-offs between these two technologies are as follows. With WiFi, a WBAN user can transmit the data packet to the cloud with low power and low delay compared with cellular technology [29], but the transmission range does not exceed 100 m [30]. Such WiFi capability is crucial to accommodate the power constraints of WBAN sensors while successfully transmitting data to the cloud system. ...
... In our implementation, the WiFi technology will be available in the cloudlet area. It was shown that, via WiFi, transmitting a data packet of 46 bytes costs about 30 mW [29,31,32], with a delay of 0.045 ms. On the other hand, a longer-range cellular network connection (e.g. ...
... 3G and LTE) is capable of transmitting the data packet to the cloud from any location covered by the cellular network, which usually spans a wider geographic area than WiFi. It was shown that, via cellular, transmitting a data packet of 46 bytes costs about 300 mW, with a delay of 0.45 ms [29,31,32]. While the cellular connection is very costly in terms of power, transmission delay, and connection cost (WiFi is mostly free of charge), it is very important for supporting WBAN user mobility when no cloudlet is in range and for scaling the system to large numbers of WBAN users, as we will discuss in Section 4. ...
Article
Wireless Body Area Networks (WBANs) have developed as an effective solution for a wide range of healthcare, military, and sports applications. Most of the proposed works studied efficient data collection from individual and traditional WBANs. Cloud computing is a new computing model that is continuously evolving and spreading. This paper presents a novel cloudlet-based efficient data collection prototype system in WBANs. The goal is to make a large volume of monitored WBAN data available to the end user or the service provider in a reliable manner. A prototype of WBANs, including Virtualized Machines (VMs) and Virtualized Cloudlets (VCs), has been proposed for simulation, characterizing efficient data collection in WBANs. Using the prototype system, we provide a scalable storage and processing infrastructure for a large-scale WBAN system. This infrastructure can efficiently handle the large volume of data generated by the WBAN system by storing the data and performing analysis operations on it. The proposed model fully supports WBAN system mobility using the cost-effective communication technologies WiFi and cellular, which are supported by the WBAN and VC systems. This contrasts with many available mHealth solutions that are limited to high-cost communication technologies, such as 3G and LTE. Performance of the proposed prototype is evaluated via an extended version of the CloudSim simulator. It is shown that the average power consumption and delay of the collected data decrease tremendously as the number of VMs and VCs increases.
... The trade-offs between these two technologies are as follows. With WiFi, a BAN user can transmit the data packet to the cloud via a smartphone with low power and low delay compared with cellular technology, but the transmission range does not exceed 100 m [17]. Such WiFi capability is crucial to accommodate the power constraints of BAN sensors while successfully transmitting data to the cloud system. ...
... In our implementation, the WiFi technology will be available in the cloudlet area. It was shown that, via WiFi, transmitting a data packet costs 30 mW with a delay of 0.045 ms [17]. On the other hand, a longer-range cellular network connection (e.g. ...
... 3G and LTE) is capable of transmitting the data packet to the cloud from any location covered by the cellular network, which usually spans a wider geographic area than WiFi. It was shown that, via cellular, transmitting a data packet costs 300 mW with a delay of 0.45 ms [17]. While the cellular connection is very costly in terms of power, transmission delay, and connection cost (WiFi is mostly free of charge), it is very important for supporting BAN user mobility when no cloudlet is in range and for scaling the system to large numbers of BAN users. ...
Conference Paper
This paper presents a large-scale BAN system in the presence of cloudlet-based data collection. The objective is to minimize end-to-end packet cost by dynamically choosing how data is collected to the cloud using the cloudlet-based system. The goal is to make the monitored BAN data available to the end user or the service provider in a reliable manner. While reducing packet-to-cloud energy, the proposed work also attempts to minimize end-to-end packet delay by dynamically choosing a neighboring cloudlet so that the overall delay is minimized, which makes the monitored data available in the cloud in real time. Note that, in the absence of network congestion in low-data-rate BANs, the storage delays due to the data collection scheme are usually much larger than the congestion delay.
... Although each application has its own specific requirements, a large class requires a similar set of features such as long-range wireless connectivity, low energy consumption and cost effectiveness [1] while only transmitting a relatively small amount of data. Even though conventional cellular standards (2G, 3G, 4G) have been designed to provide global coverage, they consume too much energy for battery-powered devices [2]. The introduction of LPWANs has enabled developers to opt for a cost effective low-power connectivity technology, whilst still enabling long-range communication. ...
... When connecting to the network, the modem negotiates several network parameters (e.g., CE level, timer values, etc.). After the node has successfully joined the network, the packet is sent (2). In this example, five bytes of data encapsulated in a User Datagram Protocol (UDP) packet, resulting in a total NB-IoT payload of 23 B, are sent. ...
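The NB-IoT example above implies a substantial fixed header overhead for small payloads. A trivial sketch (the 5-byte and 23-byte figures come from the excerpt; the overhead arithmetic is ours):

```python
# Header overhead implied by the NB-IoT example: 5 application bytes
# arrive as a 23-byte NB-IoT payload once protocol headers are added.
app_bytes = 5
total_payload = 23
overhead = total_payload - app_bytes          # 18 bytes of headers
print(overhead, overhead / total_payload)     # ~78% of the payload is overhead
```

This illustrates why per-message fixed costs dominate the energy budget for sporadic small transmissions, one of the trade-offs the Multi-RAT analysis below examines.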
Preprint
Full-text available
The broad range of requirements of Internet of Things applications has led to the development of several dedicated communication technologies, each tailored to meet a specific feature set. A solution combining different wireless technologies in one device can overcome the disadvantages of any individual technology. The design of such Multiple Radio Access Technology solutions based on the diverse characteristics of the technologies offers interesting opportunities. In this work we analyze the potential of combining LoRaWAN and NB-IoT in a Multi-RAT solution for IoT. To that end we evaluate key IoT node requirements as a function of payload size and link quality: (1) energy efficiency, (2) coverage, (3) payload size, (4) latency performance, (5) Quality of Service, and (6) cost efficiency. Our theoretical assessment and experimental validation of these IoT features show the merits of a Multi-RAT solution. Notably, energy consumption in use cases with only sporadic large-payload requirements can be improved by a factor of at least 4 with respect to either single-mode technology. Moreover, latency-critical messages can be delivered on time and coverage can be extended elegantly where needed.
... These available resources and services can be shared directly among the users in an elastic and on-demand way without time-consuming and energy-requiring interactions with preexisting infrastructure [15], for example, cellular networks at the network level and the mobile computing cloud at the computing/service level. Note that mobile applications, e.g., TensorFlow Lite, mobile video editors, and online video sharing, usually require a great amount of data transfer using D2D communications such as Bluetooth, Wi-Fi, and NFC [16]. D2D communications feature increased user throughput and network coverage but reduced cellular traffic and energy consumption in comparison with traditional cellular networks. ...
... The motion influenced by other individuals refers to learning from neighboring individuals with different composition schedules, i.e., (15), where α_i = α_target + α_local (16). Here α_i is the direction of the influenced motion estimated by the target swarm density (denoted α_target) and the local swarm density (denoted α_local), N_max is the maximum influenced speed, ω_n ∈ [0, 1] is the inertia weight of the influenced motion, and N_i^old is the last influenced motion. 2) Foraging Motion: Similarly, the movement caused by foraging, F_i, refers to learning from the individual with the highest estimated reliability so far. ...
Article
An opportunistic link between two mobile devices or nodes can be constructed when they are within each other's communication range. Typically, cyber-physical environments consist of a number of mobile devices that are potentially able to establish opportunistic contacts and serve mobile applications in a cost-effective way. Opportunistic mobile service computing is a promising paradigm capable of utilizing the pervasive mobile computational resources around the users. Mobile users are thus allowed to exploit nearby mobile services to boost their computing capabilities without investment in their resource pool. Nevertheless, various challenges, especially its quality-of-service and reliability-aware scheduling, are yet to be addressed. Existing studies and related scheduling strategies consider mobile users to be fully stable and available. In this article, we propose a novel method for reliability-aware and deadline-constrained service composition over opportunistic networks. We leverage the Krill-Herd-based algorithm to yield a deadline-constrained, reliability-aware, and well-executable service composition schedule based on the estimation of completion time and reliability of schedule candidates. We carry out extensive case studies based on some well-known mobile service composition templates and a real-world opportunistic contact data set. The comparison results suggest that the proposed approach outperforms existing ones in terms of success rate and completion time of composed services.
... However, this requires cooperation among the cellular users. Since the energy efficiency of D2D communication decreases with the increase in distance between the communicating users [8], [9], two cellular users who are located far away from each other may be better off downloading the content they need directly from the BS instead of cooperating with each other. For example, consider five cellular users {1, 2, 3, 4, 5} requesting the same file from the BS. ...
... So the sums of the U_i terms are equal in v(S_1) + v(S_2) and v(S_1 ∪ S_2) + v(S_1 ∩ S_2). Hence, by (11), to show that (9) holds, it suffices to show that: ...
Preprint
We consider a set of cellular users associated with a base station (BS) in a cellular network that employs Device-to-device (D2D) communication. A subset of the users request for some files from the BS. Now, some of the users can potentially act as relays and forward the requested files, or partitions of files, from the BS to some of the requesting users (destination nodes) over D2D links. However, this requires cooperation among the cellular users. In this paper, we seek conditions under which users have an incentive to cooperate with each other. We model the above scenario using the frameworks of cooperative game theory and stable partitions in coalitional games. We consider two different models for file transfer within a coalition: (i) Model A, in which the BS can split a file into multiple partitions and send these partitions to different relays, which multicast the partitions to the destination nodes of the coalition, and (ii) Model B, in which for each file, the BS sends the entire file to a single relay, which multicasts it to the destination nodes of the coalition. First, we explore the question of whether it is beneficial for all the cellular users to cooperate, i.e., whether the grand coalition is stable. For this we use the solution concept of core from cooperative game theory. We show that, in general, the above coalitional game under Model A may have an empty core. Next, we provide conditions under which the core is always non-empty and a D_c-stable partition always exists. Also, we show that under Model B, the problem of assigning relays to destination nodes so as to maximize the sum of utilities of all the users is NP-Complete. Finally, we show via numerical computations that a significant reduction in the energy expenditure of cellular users can be achieved via cooperation.
... Beep [11] uses audible signals with off-the-shelf devices, but its positioning accuracy is low due to latency at the sound card. Assist [12] utilises near-ultrasound pulses generated by a smartphone speaker to achieve centimetre-level positioning accuracy; however, the use of the Wi-Fi NTP protocol for time synchronisation increases power consumption considerably [13]. Akkurate [14] uses smartphones as audio receivers and supports high positioning accuracy in 2D environments, but the accuracy drops dramatically for real-world 3D location positioning. ...
... The performance parameters of the propagation channels may vary significantly. An acoustic channel's frequency response is ideally considered flat for all in-band frequencies and represented by (13) ...
Article
Full-text available
Indoor location positioning techniques have experienced significant growth in recent years. This work presents a hybrid indoor positioning system with fine and coarse modes. It utilises acoustic signals for fine positioning and received signal strength (RSS) for coarse location estimation. Acoustic positioning systems require a line-of-sight connection for accurate positioning, which may not be available due to obstacles in indoor environments. A new solution is presented to overcome this problem using RSS as a reference to validate the line-of-sight connection. Moreover, a new digital signal processing algorithm using a matched filter is presented to enhance the system's robustness in indoor environments with a low signal-to-noise ratio. Experimental measurement results in an indoor environment show that the proposed solution can accurately determine indoor locations with <6 cm positioning error on average.
... A result from the respective investigations into continuous data transmission at 1 kHz suggests that it can be supported for 24-hour monitoring using a 1200 mAh battery [138]. ...
... The analysis presented in [138] regarding the power consumption and longevity of batteries pertains to transmission energy only; adding the energy involved in preprocessing the physiological data at the sensor nodes, including analog-to-digital conversion, would further reduce the monitoring time. Considering Bluetooth as the primary means of communication, the energy dissipation depends directly on the packet format of the transmitted data, which can be optimized using standard duty cycling but might eventually lead to delays and packet loss, both highly undesirable for applications involving remote health monitoring [33]. Therefore, from the long-term system operation perspective, when implementing a wireless body area network (WBAN) comprising heterogeneous sensors, it is imperative to select data analysis algorithms with low computational complexity. ...
Thesis
Full-text available
ICT enabled body-worn remote rehabilitation system has been projected as an effective means for combating the major socio-economic challenge resulting from the need for quality care delivery for stroke survivors. The two major problems faced in such systems are: 1) while effective for characterising the patient’s performance during a constrained “exercise phase” in remote settings, the more natural indicator of rehabilitation status, i.e., the patient’s performance in an “unconstrained nomadic environment”, are often not considered and; 2) being body-worn and thus constrained by the battery life, their sustainability for long-term continuous monitoring is questionable. These shortcomings motivated the: 1) exploration of effective algorithmic strategies for accurately detecting movement of affected body parts, more specifically, the movement of the upper limb since it frequently gets affected by stroke episodes – in unconstrained scenarios and; 2) translation of the algorithms to dedicated low-power hardware with an aim of enhancing the battery life of a resource constrained body-worn sensor based remote rehabilitation system for its sustained operation satisfying the notion of long-term continuous monitoring. Following instructions of expert physiotherapists, this work concentrates on detecting three fundamental upper limb movements in unconstrained scenarios: extension/flexion of the forearm; rotation of the forearm about the elbow; and rotation of the arm about the long axis of forearm, using body-worn inertial sensors. After selecting the appropriate type of inertial sensors and their positions through exhaustive experiments, two novel algorithms were proposed to recognize the above mentioned movements: 1) clustering and minimum distance classifier based approach and 2) tracking the orientation of an inertial sensor placed on the wrist. 
The performance of the algorithms has been evaluated prospectively through an archetypal activity, ‘making-a-cup-of-tea’, which includes multiple occurrences of the chosen movements. The proposed clustering-based approach detected the three movements with average accuracies of 88% and 70% using accelerometer data and 83% and 70% using gyroscope data obtained from the wrist, for healthy subjects and stroke survivors respectively. In comparison, the proposed sensor-orientation-based methodology, using a wrist-worn accelerometer only, recognized the three movements with accuracies in the range of 91-99% for healthy subjects and 70-85% for stroke survivors. However, the clustering-based approach provides greater flexibility in terms of incorporating new types of movements apart from the ones chosen here and can also be used to track changes in motor functionality over time. Subsequently it was translated into a novel ASIC with a dynamic power consumption of 25.9 mW @ 20 MHz in 130 nm technology. On the other hand, the sensor-orientation-based approach was also validated in hardware using an Altera DEII FPGA system, for high-speed real-time movement recognition.
... The fundamental problem with continuous data transmission is the energy requirement. Results from investigations into continuous data transmission at 1 kHz suggest that it can be supported for 24 h of monitoring using a 1200 mAh battery [143]. ...
... The analysis presented in [143] regarding the power consumption and longevity of batteries pertains to transmission energy only; adding the energy involved in pre-processing the physiological data at the sensor nodes, including analogue-to-digital conversion, quantization, filtering and the microcontroller operation, would bring down the effective time of monitoring to 8-10 h, thereby making the entire system power hungry and affecting the life of the batteries. An increased battery capacity, such as the 1800 mAh prismatic zinc-air battery operating at 1.4 V used recently in the medical community, would increase the respective sizes of the sensor nodes. ...
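The figures above admit a quick sanity check: a 1200 mAh battery sustaining 24 h of continuous transmission implies an average draw of 50 mA, and any preprocessing current added on top shortens the lifetime proportionally. A minimal sketch, where the 85 mA preprocessing draw is an illustrative assumption chosen to land in the reported 8-10 h range:

```python
# Idealized battery lifetime: capacity divided by average current draw.
def lifetime_hours(capacity_mah, avg_current_ma):
    return capacity_mah / avg_current_ma

CAPACITY_MAH = 1200.0
tx_current_ma = CAPACITY_MAH / 24.0   # 50 mA sustains 24 h of transmission alone
preproc_current_ma = 85.0             # assumed ADC + filtering + MCU draw

print(lifetime_hours(CAPACITY_MAH, tx_current_ma))  # 24.0
print(round(lifetime_hours(CAPACITY_MAH, tx_current_ma + preproc_current_ma), 1))  # 8.9
```

This neglects battery non-idealities such as rate-dependent capacity and cutoff voltage, which the zinc-air discussion above implies matter in practice.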
Chapter
This chapter explores the field of remote sensor systems using wearable technologies that play a significant role in monitoring activities of patients in home and community settings. The focus is on body area sensing networks incorporating the primary enabling technologies: sensors for capturing the physiological and kinematic data, and data analysis techniques for extracting the clinically relevant information. With respect to the StrokeBack project, the majority of this chapter is dedicated to physical activity monitoring, a key component in stroke rehabilitation. In particular, the domain of upper limb rehabilitation is examined since reduction of upper limb motor function is a common effect of stroke and significantly impairs the performance of patients as they engage in activities of daily life. As an example, a case study is presented where different arm movements are recognized in real time using data from inertial sensors attached to the arm. Tracking the occurrences of specific arm movements (e.g. prescribed exercises) over time can give an indication of rehabilitation progress since the frequency of these movements is expected to increase as motor functionality improves. © Springer International Publishing Switzerland 2016. All rights reserved.
... Even when data links to a cellular network are available, mobile users can still take advantage of local WiFi contacts to exchange large amounts of data faster and in a more energy-efficient fashion than transmitting data over GSM. In fact, power analysis of different wireless interfaces shows that for high transmission rates the GSM radio consumes considerably more power than the WiFi radio [Balani, 2007]. ...
... All this peripheral equipment consumes energy and places growing pressure on the device's battery life. On the other hand, advances in battery design have stalled for the past few years [Balani, 2007]. With recent advances in mobile multimedia communication, efficient usage of bandwidth is considered an important design criterion, in addition to energy, in developing new mobile technologies. Routing algorithms designed for mobile DTNs transmit data to its destination through intermediate nodes, also known as relays, which might not be interested in the data themselves. ...
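The GSM-versus-WiFi observation comes down to energy per byte: a radio with higher instantaneous power can still be cheaper per byte if its throughput is high enough. A minimal sketch; the 400 mW / 60 kbit/s (GSM-like) and 700 mW / 5 Mbit/s (WiFi-like) figures are illustrative assumptions, not measurements from [Balani, 2007]:

```python
# Energy cost per byte = radio power / throughput.
def uj_per_byte(power_mw, rate_kbps):
    # mW divided by (kbit/s / 8) gives microjoules per byte
    return power_mw * 8.0 / rate_kbps

gsm_like = uj_per_byte(power_mw=400.0, rate_kbps=60.0)     # slow, moderate power
wifi_like = uj_per_byte(power_mw=700.0, rate_kbps=5000.0)  # fast, higher power

print(round(gsm_like, 1), round(wifi_like, 2))  # 53.3 1.12
```

Despite the WiFi radio drawing nearly twice the power, it spends roughly fifty times less energy per byte at these assumed rates.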
... We use GSM IoT (EC-GSM-IoT) services for transmission because it is a low-power wide-area network technology providing long-range, low-complexity cellular communications for IoT devices in a way that conserves energy. Various studies have demonstrated the suitability of the GSM IoT service and compared it with other standards such as Wi-Fi and cellular networks [24,25]. The UAVs' transmission ranges and coverage areas are based on the node density on the ground [26]. ...
Article
Full-text available
New technologies and communication standards have changed traditional network processes. Internet of Things (IoT) is one of the emerging technologies in which devices are connected to serve users. When networks become congested due to a large number of users, the existing routing protocols and communication channels suffer from congestion, disconnection, overhead, and packet-drop issues. Unmanned Aerial Vehicles (UAVs) are adopted to support the ground networks for more feasible data communication. These networks provide coverage facilities to IoT networks and provide smooth data dissemination services. Through the use of relay and cooperative communication technologies, UAVs can enlarge the communication space for IoT networks. Traditional network routing protocols have been adopted for data communication in these networks. However, the adopted protocols are not able to handle mobility and uncertain network conditions. This paper proposes Decision-based Routing for Unmanned Aerial Vehicles and Internet of Things (DR-UAVIoT) networks. The proposed protocol is useful for UAV-to-IoT and UAV-to-UAV data communication. The performance of the proposed solution is evaluated against existing protocols in terms of data delivery, delay, and network overhead. The experimental results indicate the better performance of the proposed protocol in terms of lower delay, lower overhead, and a better data delivery ratio as compared with existing routing protocols.
... Conventional cellular standards (3G, 4G) were designed to provide global coverage, but they consume too much energy for battery-powered devices [2]. The introduction of LPWANs has enabled developers to opt for a cost-effective low-power connectivity technology while still enabling long-range communication. ...
Article
The broad range of requirements of IoT applications has led to the development of several dedicated communication technologies, each tailored to meet a specific feature set. A solution combining different wireless technologies in one device can overcome the disadvantages of any individual technology. The design of such multi-RAT solutions based on the diverse characteristics of the technologies offers interesting opportunities. We have assessed both the potential gains and the overhead brought by a multi-RAT solution. To that end, we have evaluated key IoT node requirements as a function of payload size and link quality: energy efficiency, coverage, payload size, latency performance, QoS, and cost efficiency. Our assessment and experimental validation of these features show the merits of a multi-RAT solution. Notably, energy consumption in use cases with only sporadic large-payload requirements can be improved by a factor of at least 4 with respect to single-mode technologies. Moreover, latency-critical messages can get delivered on time, and coverage can be extended elegantly where needed.
... The average energy consumption of a base transceiver station (BTS) is associated with the communication technology being used; as depicted in the literature comparing the Global System for Mobile communications (GSM) and the Universal Mobile Telephone Service (UMTS), GSM has considerably higher energy consumption than UMTS [28]. Similarly, multiple energy consumption analyses have been performed for various communication technologies such as Bluetooth, Wi-Fi and cellular networks [29,30]. ...
Article
Full-text available
Cellular networks based on new-generation standards are the major enabler for Internet of Things (IoT) communication. Narrowband-IoT and Long Term Evolution for Machines are the newest wide-area-network-based cellular technologies for IoT applications. The deployment of unmanned aerial vehicles (UAVs) has gained popularity in cellular networks by providing temporary ubiquitous coverage in areas where infrastructure-based networks are either not available or have vanished due to disasters. The major challenge in such networks is an efficient UAV deployment that covers the maximum number of users and area with the minimum number of UAVs. The performance and sustainability of UAVs are largely dependent upon the available residual energy, especially in mission planning. Although energy harvesting techniques and efficient storage units are available, they have their own constraints, and the limited onboard energy still severely hinders the practical realization of UAVs. This paper employs neglected parameters of UAV energy consumption in order to obtain the actual status of available energy and proposes a solution that more accurately estimates the UAVs' operational airtime. The proposed model is evaluated in testbed and simulation environments, where the results show that considering such explicit usage parameters achieves a significant improvement in airtime estimation.
... Research shows that one protocol can be better than others in terms of packet size and data rate. 21 Again, since IoT devices are battery operated, varying the data rate also changes the battery consumption. Given that energy conservation is one of the important features, 22 recharging or replacing the battery at a frequent rate is not suggested. ...
Article
With the proliferation of Internet of Things (IoT) and edge computing paradigms, billions of IoT devices are being networked to support data-driven and real-time decision making across numerous application domains, including smart homes, smart transport, and smart buildings. These ubiquitously distributed IoT devices send the raw data to their respective edge device (eg, IoT gateways) or the cloud directly. The wide spectrum of possible application use cases makes the design and networking of IoT and edge computing layers a very tedious process due to: (i) the complexity and heterogeneity of end-point networks (eg, Wi-Fi, 4G, and Bluetooth); (ii) the heterogeneity of edge and IoT hardware resources and software stacks; (iii) the mobility of IoT devices; and (iv) the complex interplay between the IoT and edge layers. Unlike cloud computing, where researchers and developers seeking to test capacity planning, resource selection, network configuration, computation placement, and security management strategies have access to public cloud infrastructure (eg, Amazon and Azure), establishing an IoT and edge computing testbed that offers a high degree of verisimilitude is not only complex, costly, and resource-intensive but also time-intensive. Moreover, testing in real IoT and edge computing environments is not feasible due to the high cost and diverse domain knowledge required in order to reason about their diversity, scalability, and usability. To support performance testing and validation of IoT and edge computing configurations and algorithms at scale, simulation frameworks should be developed. Hence, this article proposes a novel simulator, IoTSim-Edge, which captures the behavior of heterogeneous IoT and edge computing infrastructure and allows users to test their infrastructure and framework in an easy and configurable manner. IoTSim-Edge extends the capability of CloudSim to incorporate the different features of edge and IoT devices.
The effectiveness of IoTSim‐Edge is described using three test cases. Results show the varying capability of IoTSim‐Edge in terms of application composition, battery‐oriented modeling, heterogeneous protocols modeling, and mobility modeling along with the resources provisioning for IoT applications.
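The battery-oriented modeling mentioned above can be pictured with a toy device model. This is not the IoTSim-Edge API; the class and parameter names below are illustrative assumptions only:

```python
# Toy battery-drain model in the spirit of battery-oriented IoT
# simulation. NOT the IoTSim-Edge API: names here are illustrative.
class IoTDevice:
    def __init__(self, battery_mj, tx_mj_per_kb):
        self.battery_mj = battery_mj        # remaining energy, millijoules
        self.tx_mj_per_kb = tx_mj_per_kb    # transmission cost per kilobyte

    def send(self, payload_kb):
        cost = payload_kb * self.tx_mj_per_kb
        if cost > self.battery_mj:
            return False                    # battery exhausted, send fails
        self.battery_mj -= cost
        return True

dev = IoTDevice(battery_mj=100.0, tx_mj_per_kb=2.5)
sent = sum(dev.send(4.0) for _ in range(12))  # each 4 kB send costs 10 mJ
print(sent, dev.battery_mj)                   # 10 0.0
```

Raising the data rate or payload size in such a model directly shortens the number of sends a battery supports, which is the kind of trade-off a simulator lets one explore before deployment.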
Preprint
This paper proposes a novel simulator IoTSim-Edge, which captures the behavior of heterogeneous IoT and edge computing infrastructure and allows users to test their infrastructure and framework in an easy and configurable manner. IoTSim-Edge extends the capability of CloudSim to incorporate the different features of edge and IoT devices. The effectiveness of IoTSim-Edge is described using three test cases. The results show the varying capability of IoTSim-Edge in terms of application composition, battery-oriented modeling, heterogeneous protocols modeling and mobility modeling along with the resources provisioning for IoT applications.
... Another approach to analyzing the energy consumption of mobile devices is from the applications perspective. In this context, some works [13,14] have carried out measurements of the consumption related to the use of applications such as Bluetooth and the SMS service. This experiment was performed 15 times for each state in a closed environment, using Bluetooth 2.0 technology. ...
... A large number of studies have been carried out by many researchers to determine the power consumption characteristics of mobile devices. Rahul Balani [8] analyzed energy consumption for various radio transmission technologies such as BT, Wi-Fi, 2G, and 3G. Andrew Rice et al. [9] developed a measurement framework that logs traces of the consumed power. ...
Article
Full-text available
The use of mobile devices has increased many-fold over the last few years. Smart phones are used not only for communication but also for storage of personal content like photos, videos, documents, and bank account and credit/debit card details. Secure access to this content is very important. Nowadays, many mobile devices come with inbuilt biometric-enabled security features like fingerprint, face recognition, and iris. However, additional battery power is consumed each time a user unlocks the device using any one of these security features. In order to enable prolonged use of the device, there is a strong need to find ways to conserve power in mobile devices. At the same time, it is also equally important that the smart phone user knows how long the battery of his device will last. In this paper, we present a novel power optimization and battery lifetime prediction framework called P4O (Pattern, Profiling, Prediction, and Power Optimization). Our contributions are threefold: (i) propose a novel framework for power optimization in smart phones; (ii) propose a new approach for battery lifetime forecast; (iii) implement and validate the efficacy of the proposed framework. For experimental results, the proposed framework was implemented on an Android-based smartphone. The experimental results validate the proposed framework with power optimization of up to 40% over the default Linux and Android power saving features available in an Android operating system. This framework is also able to forecast battery lifetime with accuracy of up to 98%.
... Therefore, it is still essential to transmit large numbers of raw datasets through a WSN with a high transmission speed to implement remote and real-time machine condition monitoring. The relationship between power consumption and transmission rate has been researched, revealing that long-term data transmission at a low rate can consume more power than short-term transmission at a high rate when the effect of transmission distance is excluded [40,41]. Therefore, a wireless network with a higher transmission rate (for example BLE, Wi-Fi and ZigBee) is critical in order to achieve remote machine condition monitoring in real time. ...
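The low-rate-versus-high-rate trade-off is easy to reproduce numerically: shipping a fixed payload at a higher rate shortens the radio's on-time, and the saved active time can outweigh the higher active power. A minimal sketch with illustrative power and rate figures (not values from [40,41]):

```python
# Energy to deliver a fixed payload within a reporting window: the radio
# transmits at `rate_kbps`, then sleeps for the rest of the window.
def transfer_energy_mj(payload_kb, rate_kbps, active_mw, sleep_mw, window_s):
    on_time = payload_kb * 8.0 / rate_kbps           # seconds transmitting
    off_time = max(window_s - on_time, 0.0)          # seconds asleep
    return active_mw * on_time + sleep_mw * off_time # mW * s = mJ

slow = transfer_energy_mj(100.0, 250.0, 60.0, 0.5, 10.0)    # low-rate radio
fast = transfer_energy_mj(100.0, 2000.0, 120.0, 0.5, 10.0)  # high-rate radio
print(round(slow, 1), round(fast, 1))  # 195.4 52.8
```

Here the high-rate radio draws twice the active power yet spends under a third of the energy per window, because it finishes in 0.4 s instead of 3.2 s.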
Article
Full-text available
Condition monitoring can reduce machine breakdown losses, increase productivity and operation safety, and therefore deliver significant benefits to many industries. The emergence of wireless sensor networks (WSNs) with smart processing ability plays an ever-growing role in online condition monitoring of machines. WSNs are cost-effective networking systems for machine condition monitoring. They avoid cable usage and ease system deployment in industry, which leads to significant savings. Powering the nodes is one of the major challenges for a true WSN system, especially when positioned at inaccessible or dangerous locations and in harsh environments. Promising energy harvesting technologies have attracted the attention of engineers because they convert microwatt- or milliwatt-level power from the environment to implement maintenance-free machine condition monitoring systems with WSNs. The motivation of this review is to investigate the energy sources, stimulate the application of energy-harvesting-based WSNs, and evaluate the improvement of energy harvesting systems for mechanical condition monitoring. This paper overviews the principles of a number of energy harvesting technologies applicable to industrial machines by investigating the power consumption of WSNs and the potential energy sources in mechanical systems. Many models and prototypes with different features are reviewed, especially in the mechanical field. Energy harvesting technologies are evaluated for further development according to a comparison of their advantages and disadvantages. Finally, the challenges and potential future research directions of energy harvesting systems powering WSNs for machine condition monitoring are discussed.
... As an example, we consider reducing the number of needed GPS and ozone-level measurements by sharing them through Bluetooth. Indeed, with an average power consumption of less than 20 mW [15], Bluetooth is a very good candidate for replacing the pluggable ozone sensors [226] (more than 400 mW according to our measurements) if another device can share one. Designing an energy-efficient sensing protocol for such distributed mobile sensing scenarios is non-trivial, and we face the following challenges: ...
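The 20 mW versus 400 mW gap above translates into a large group-level saving when one device samples and the rest receive the reading over Bluetooth. The power figures come from the text; the sharing pattern (one sampler, one Bluetooth link per device) is an illustrative assumption:

```python
# Aggregate sensing power for a group of n devices, with and without
# sharing one ozone reading over Bluetooth.
SENSOR_MW = 400.0     # pluggable ozone sensor (from the text)
BLUETOOTH_MW = 20.0   # average Bluetooth draw (from the text)

def group_power_mw(n_devices, share):
    if not share:
        return n_devices * SENSOR_MW               # everyone samples locally
    return SENSOR_MW + n_devices * BLUETOOTH_MW    # one sampler + BT sharing

print(group_power_mw(5, share=False))  # 2000.0
print(group_power_mw(5, share=True))   # 500.0
```

Sharing pays off for any group of two or more devices under these figures, which is why the protocol design effort centers on coordinating who samples and who listens.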
Article
With the ever-increasing adoption of smartphones worldwide, researchers have found the perfect sensor platform to perform context-based research and to prepare for context-based services to be deployed for end-users. However, continuous context sensing imposes a considerable challenge in balancing the energy consumption of the sensors, the accuracy of the recognized context and its latency. After outlining the common characteristics of continuous sensing systems, we present a detailed overview of the state of the art, from sensor sub-systems to context inference algorithms. Then, we present the three main contributions of this thesis. The first approach we present is based on the use of local communications to exchange sensing information with neighboring devices. As proximity, location and environmental information can be obtained from nearby smartphones, we design a protocol for synchronizing the exchanges and fairly distributing the sensing tasks. We show both theoretically and experimentally the reduction in energy needed when the devices can collaborate. The second approach focuses on the way to schedule mobile sensors, optimizing for both accuracy and energy needs. We formulate the optimal sensing problem as a decision problem and propose a two-tier framework for approximating its solution. The first tier is responsible for segmenting the sensor measurement time series by fitting various models. The second tier takes care of estimating the optimal sampling, selecting the measurements that contribute the most to the model accuracy. We provide near-optimal heuristics for both tiers and evaluate their performance using environmental sensor data. In the third approach we propose an online algorithm that identifies repeated patterns in time series and produces a compressed symbolic stream. The first symbolic transformation is based on clustering of the raw sensor data, whereas subsequent iterations encode repetitive sequences of symbols into new symbols. We also define a metric to evaluate the symbolization methods with regard to their capacity to preserve the system's states. We also show that the output symbols can be used directly for various data mining tasks, such as classification or forecasting, without much impact on accuracy, while greatly reducing the complexity and running time. In addition, we present an example application, assessing the user's exposure to air pollutants, which demonstrates the many opportunities to enhance contextual information when fusing sensor data from different sources. On one side, we gather fine-grained air quality information from mobile sensor deployments and aggregate it with an interpolation model. On the other side, we continuously capture the user's context, including location, activity and surrounding air quality. We also present the various models used for fusing all this information in order to produce the exposure estimation.
Article
Internet of Medical Things (IoMT) is igniting many emerging smart health applications by continuously streaming big data for data-driven innovations. One critical obstacle in IoMT big data is the power-hungry nature of long-term data transmission. Targeting this challenge, we propose a novel framework called IoMT Big-data Bayesian-backward Deepencoder learning (IBBD), which mines deep autoencoder (AE) configurations for data sparsification and determines optimal trade-offs between information loss and power overhead. More specifically, the IBBD framework leverages an additional external Bayesian-backward loop that recommends AE configurations, on top of a traditional deep learning loop that executes and evaluates the AE quality. The IBBD recommendation is based on confidence to further minimize the regularized metrics that quantify the quality of AE configurations, and it further leverages regularization techniques to allow adjusting error-power trade-offs in the mining process. We have conducted thorough experiments on a cardiac data streaming application and demonstrated the superiority of IBBD over common practices like the Discrete Wavelet Transform, and we have further generalized IBBD by validating the optimal AE configurations determined on one user against other users. This study is expected to greatly advance IoMT big data streaming practices towards precision medicine.
Article
We consider a set of cellular users associated with a base station (BS) in a cellular network that employs Device-to-device (D2D) communication. A subset of the users request some files from a server associated with the BS. Now, some of the users can potentially act as relays and forward the requested files, or partitions of files, from the BS to some of the requesting users (destination nodes) over D2D links. However, this requires cooperation among the cellular users. Also, when cellular users cooperate with each other, the total amount of energy consumed in transferring the requested files from the BS to the destination nodes can usually be considerably reduced compared to the case when each user separately downloads the file it needs from the BS. In this paper, we seek conditions under which users have an incentive to cooperate with each other. To this end, we model the above scenario using the frameworks of cooperative game theory and stable partitions in coalitional games. We consider two different models for file transfer within a coalition: (i) Model A, in which the BS can split a file into multiple partitions and send these partitions to different relays, which multicast the partitions to the destination nodes of the coalition, and (ii) Model B, in which for each file, the BS sends the entire file to a single relay, which multicasts it to the destination nodes of the coalition. First, we explore the question of whether it is beneficial for all the cellular users to cooperate, i.e., whether the grand coalition is stable. For this we use the solution concept of the core from cooperative game theory. We show that, in general, the above coalitional game under Model A may have an empty core, i.e., it may not be possible to stabilize the grand coalition. Next, we provide conditions under which 1) the core is always non-empty and 2) a D_c-stable partition always exists.
Also, we show that under Model B, the problem of assigning relays to destination nodes so as to maximize the sum of utilities of all the users is NP-Complete. Finally, using numerical computations, we evaluate the energy consumption of the cellular users under the cooperation and no cooperation cases for Model A and the performance of different heuristics for solving the above problem of assigning relays to destination nodes under Model B.
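The energy motivation behind cooperation can be pictured with a toy version of Model B: the BS sends one full copy of a file to a single relay, which multicasts it over cheap D2D links, versus each destination downloading its own copy from the BS. The per-transfer energy costs below are illustrative assumptions, not values from the paper:

```python
# Toy total-energy comparison for relay-based multicast cooperation.
def no_cooperation_mj(n_dest, bs_link_mj):
    return n_dest * bs_link_mj          # one BS download per user

def model_b_mj(bs_link_mj, d2d_multicast_mj):
    # one BS-to-relay transfer plus a single D2D multicast to all
    return bs_link_mj + d2d_multicast_mj

BS_MJ, D2D_MJ = 50.0, 8.0               # assumed per-transfer energies
print(no_cooperation_mj(4, BS_MJ))      # 200.0
print(model_b_mj(BS_MJ, D2D_MJ))        # 58.0
```

Cooperation wins whenever the single D2D multicast is cheaper than the BS downloads it replaces; whether users actually have an incentive to realize this saving is exactly the coalitional-stability question the paper studies.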
Chapter
Full-text available
The availability of effective communications in post-disaster scenarios is key to implementing emergency networks that enable the sharing of critical information and support the coordination of the emergency response. To deliver levels of QoS suitable to these applications, it is vital to exploit the multiple communication opportunities made available by the progressive deployment of the 5G and Smart City paradigms, ranging from ad-hoc networks among smartphones and surviving IoT devices, to cellular networks, but also drone-based and vehicle-based wireless access networks. Therefore, the user device should be able to opportunistically select the most convenient among them to satisfy the QoS demands imposed by the applications and also minimize power consumption. The driving idea of this paper is to leverage non-cooperative game theory to design such an opportunistic user association strategy in a post-disaster scenario using UAV ad-hoc networks. The adaptive game-theoretic scheme allows increasing the QoS of the communication means by lowering the loss rate, while keeping the energy consumption moderate.
Article
A method for electrocardiogram (ECG) feature extraction is presented for automatic classification of heartbeats, using values of RR intervals, amplitude and Hjorth parameters. Hjorth parameters have been used in a variety of research areas, but their application to ECG signal processing is still little explored. This paper also introduces a new approach to heartbeat segmentation, which avoids mixing information from adjacent beats and improves classification performance. The proposed model is validated in the Massachusetts Institute of Technology - Beth Israel Hospital (MIT-BIH) Arrhythmia database and presents an overall accuracy of 90.4%, better than other state-of-the-art methods. There is an improvement over other models in positive predictivity for class S (66.6%) of supraventricular ectopic beats, and sensitivity for class N (93.0%). Results obtained indicate that the techniques used in this study can be successfully applied to the problem of automatic heartbeat classification. In addition, this new approach has low computational cost, which allows its later implementation in hardware devices with limited resources.
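The Hjorth parameters mentioned above have standard definitions: activity is the signal variance, mobility is the square root of the ratio between the variance of the first difference and that of the signal, and complexity is the mobility of the first difference divided by the mobility of the signal. A self-contained sketch (the sinusoidal test signal is an arbitrary example, not MIT-BIH data):

```python
import math

def hjorth(signal):
    """Return (activity, mobility, complexity) of a discrete signal."""
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    d1 = [b - a for a, b in zip(signal, signal[1:])]  # first difference
    d2 = [b - a for a, b in zip(d1, d1[1:])]          # second difference
    v0, v1, v2 = variance(signal), variance(d1), variance(d2)
    mobility = (v1 / v0) ** 0.5
    complexity = ((v2 / v1) ** 0.5) / mobility
    return v0, mobility, complexity

# For a pure sinusoid, mobility approaches 2*sin(w/2) (about 0.2 for
# w = 0.2 rad per sample) and complexity approaches 1.
act, mob, comp = hjorth([math.sin(0.2 * i) for i in range(200)])
```

Being built only from variances of successive differences, these features are cheap to compute per heartbeat segment, which fits the paper's low-computational-cost goal.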
Article
Full-text available
Diabetic retinopathy is a leading cause of visual deficiency. This paper introduces an automatic method to detect and remove the blood vessels in retinal images. The detection of the blood vessels is the fundamental step in the detection of diabetic retinopathy because the blood vessels are the typical features of the retinal image. Detecting the blood vessels can help ophthalmologists recognize the diseases earlier and faster. The blood vessels are detected and removed by applying a Gabor filter on two freely available retinal databases, STARE and DRIVE. The accuracy of the segmentation algorithm is evaluated quantitatively by comparing the manually segmented images with the corresponding output images; the Gabor filter with entropic-threshold vessel pixel segmentation extracts vessels with a lower false-positive fraction rate.
Conference Paper
We demonstrate a prototype that takes advantage of open-source software to put a full-text searchable copy of Wikipedia on a Raspberry Pi, providing nearby devices access to content via WiFi or Bluetooth without requiring internet connectivity. This short paper articulates the advantages of such a form factor and provides an evaluation of browsing and search capabilities. We believe that personal digital libraries on lightweight mobile computing devices represent an interesting research direction to pursue.
Article
A Distributed Virtual Environment (DVE) system provides a shared virtual environment where physically separated users can interact and collaborate over a computer network. There are three major challenges to improve DVE scalability: effective DVE system performance measurement, understanding the controlling factors of system performance/quality and determining the consequences of DVE system changes. We describe a DVE Scalability Engineering (DSE) process that addresses these three major challenges for DVE design. The DSE process allows us to identify, evaluate, and leverage trade-offs among DVE resources, the DVE software, and the virtual environment. We integrate our load simulation and modeling method into a single process to explore the effects of changes in DVE resources.
Conference Paper
Home networks have become heterogeneous environments hosting a variety of wireless and wired telecommunication technologies. Currently, no intelligent energy-saving mechanism exists to control home networks. In this paper we propose an energy-aware strategy that integrates a Wireless Sensor Network (WSN) with a convergent digital home network. We aim to demonstrate that a WSN can act as a dependable control plane to manage the high-speed home network. While the home network nodes can be deactivated, the WSN is always on and, due to the low data rate required to work properly, it consumes a very limited quantity of energy. This mutual interaction leads to a substantial reduction in energy consumption. Simulation results show that this strategy is effective in different scenarios and provides a tangible economic benefit.
Article
A large number of new data-consuming applications are emerging in the daily routines of mobile users. Device-to-Device (D2D) communication is introduced as a new paradigm to reduce the increasing traffic and offload it to the user equipment (UE). With the development of UE multi-radio interfaces, we first develop a new hybrid architecture concept for D2D communication. The architecture combines the 2.4 GHz ISM spectrum as the Out-Band mode, using Bluetooth and Wifi-Direct, with the cellular spectrum as the In-Band mode. Secondly, we design a scheme that forms the Out-Band cluster and handles the subsequent periodic signaling interaction via the Bluetooth interface. Traffic is transferred via the Wifi-Direct interface inside the cluster but carried on the cellular spectrum among the clusters. Simulation results show that our proposal increases the system throughput, saves power consumption and prolongs the clusters' lifetime.
Article
This paper addresses the problem of energy-aware multihop cooperation among the mobile terminals (MTs) that cooperatively download a common content from a wireless network. The base station (BS) unicasts or multicasts the content to selected MTs that, in turn, either unicast or multicast it to other MTs, forming a multihop ad hoc network with a predefined maximum allowed number of hops. This paper presents the optimization formulations whose solution gives the exact optimal set of receiving MTs from the BS, the optimal multihop ad hoc network, and the optimal unicasting and multicasting transmission bit rates that minimize the total energy consumption of the MTs. Second, a simplified multicasting formulation is proposed that has a close-to-optimal performance with notably lower computational complexity. Third, interference avoidance among the transmitting MTs is considered. For each presented formulation, the complexity is identified, and results show that some formulations can be efficiently solved for medium network sizes, while others are more computationally complex. Thus, polynomial-time heuristic solutions are presented that have close-to-optimal performance. Results demonstrate remarkable energy consumption reduction gains and wireless resources savings under various network scenarios.
Article
The subject of this article is the analysis of vehicle communication through the OBD-II diagnostic interface with the diagnostic tools used in automotive diagnostics, with an emphasis on Bluetooth communication. Based on this analysis, the paper describes the design of software, and its implementation on mobile devices, for communicating with the CAN OBD-II interface. This type of communication typically uses the diagnostics-oriented ELM 327 interpreter. The aim of this work is to provide an application for mobile devices running the Android operating system that can obtain real-time data from the engine control unit through the ELM 327.
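The real-time data mentioned above is obtained by sending standard mode-01 OBD-II requests (e.g. `010C` for engine RPM) to the ELM 327 over a Bluetooth serial link and decoding the hex reply using the standard SAE J1979 scaling formulas. A minimal decoding sketch (the Bluetooth transport itself is omitted and the function name is hypothetical):

```python
def parse_elm327(response: str) -> float:
    """Decode an ELM327 reply to a mode-01 OBD-II request.

    A reply echoes the mode + 0x40 ("41"), then the PID, then data bytes,
    e.g. "41 0C 1A F8" answers request "010C" (engine RPM).
    """
    tokens = response.strip().split()
    if tokens[0] != "41":
        raise ValueError("not a mode-01 response")
    pid = tokens[1]
    data = [int(b, 16) for b in tokens[2:]]
    if pid == "0C":                      # engine RPM: ((A * 256) + B) / 4
        return (data[0] * 256 + data[1]) / 4
    if pid == "05":                      # coolant temperature: A - 40 (deg C)
        return data[0] - 40
    raise ValueError(f"PID {pid} not handled in this sketch")

print(parse_elm327("41 0C 1A F8"))  # 1726.0 rpm
print(parse_elm327("41 05 5A"))     # 50 (deg C)
```

The PID formulas used here (`0C`, `05`) are the standard published ones; a full application would add more PIDs and the serial/RFCOMM plumbing.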
Article
In military operations, marching is a common method of supply-troop movement. Supply routes typically run through wilderness where conditions change over time. This paper proposes a power-saving algorithm that allows supply troops to collect route information using wireless sensor network technology. Each member of the marching supply troop is equipped with a battery-powered sensor. To reduce power consumption, the proposed methods schedule the sleeping period of each member according to the size of the marching supply troop and its moving velocity. Two data-carrying methods are proposed to reduce the frequency of long-distance data uploads: the first carries the uploaded data within a single-round data collection period, and the second extends the data-carrying period over multiple rounds. The simulation results show that scheduling a sleep period can prolong the sensing distance along the route; the two proposed methods extend the collected distance data by an additional 18–70% over methods without sleep scheduling, and the energy spent on long-distance data transmissions can be reduced by 7–25%. Copyright © 2015 John Wiley & Sons, Ltd.
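The abstract does not give the scheduling rule itself; the sketch below assumes a deliberately simple model in which a sensor only needs to stay awake while the troop (plus a sensing margin) passes it, so the awake time and duty cycle follow directly from troop length and velocity. All parameter values are made up for illustration.

```python
def sleep_schedule(troop_length_m: float, velocity_mps: float,
                   sensing_range_m: float, period_s: float):
    """Assumed model: a node must be awake while the troop of length L,
    moving at speed v, is within sensing range R of it; it can sleep
    until the next pass, `period_s` seconds later.
    Returns (awake_seconds_per_pass, duty_cycle)."""
    awake = (troop_length_m + 2 * sensing_range_m) / velocity_mps
    duty_cycle = min(awake / period_s, 1.0)
    return awake, duty_cycle

awake, dc = sleep_schedule(100.0, 1.5, 25.0, 1800.0)
print(awake, round(dc, 4))  # 100.0 0.0556
```

Even this crude model shows the paper's basic lever: a larger troop or slower march forces longer awake windows, so the achievable sleep period, and hence the battery lifetime, depends on troop size and velocity.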
Article
In the last decade, one of the main goals in wireless telecommunications has been to reduce the energy consumption of mobile devices. However, making a network device green can degrade performance. The goal of this paper is to propose a cross-layer approach to the design of a mobile video cloud for uplink transmission towards the Internet. The proposed approach is adaptive at both the video sources and the wireless transmitter. A source rate controller compensates for the transmission-bandwidth reduction caused by the energy-saving policies. Energy saving in wireless transmission over the mobile cloud's cellular channel is achieved by introducing an energy-efficient ARQ protocol, which can apply different transmission laws to exploit the correlation in the cellular channel's behavior. An analytical model of the system is defined to compare the transmission laws and to provide design guidelines for choosing one of them and setting its parameters.
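The paper's transmission laws are not detailed in the abstract, but the energy benefit of exploiting channel correlation can be sketched with a toy slotted ARQ model over an assumed bursty error trace: deferring a retransmission after a failure skips the rest of the error burst, so delivery needs fewer (energy-costing) transmissions.

```python
def run_arq(channel: str, n_packets: int, backoff: int):
    """Send n_packets over a slotted channel trace ('G' = slot delivers,
    'B' = slot loses the frame). After a failed transmission, wait
    `backoff` slots before retrying (backoff = 0 is persistent ARQ).
    Returns (transmissions_used, finish_slot)."""
    t, tx, sent = 0, 0, 0
    while sent < n_packets:
        if t >= len(channel):
            raise RuntimeError("channel trace too short")
        tx += 1
        if channel[t] == "G":
            sent += 1
            t += 1
        else:
            t += 1 + backoff   # skip slots likely still in the error burst
        # energy ~ tx: every transmission costs energy, waiting is cheap
    return tx, t

trace = "GGBBBGGGBBGG"          # assumed bursty (correlated) error trace
print(run_arq(trace, 3, 0))    # persistent ARQ: (6, 6)
print(run_arq(trace, 3, 2))    # backoff = 2:     (4, 6)
```

On this trace, both laws finish in the same slot, but the correlation-aware law uses 4 transmissions instead of 6; on an uncorrelated channel the backoff would only add delay, which is exactly the trade-off an analytical model helps navigate.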
Conference Paper
Multi-sensory mobile health monitoring systems promise substantial improvements in the quality of healthcare. However, large-scale trials are uncovering key issues that inhibit long-term, large-scale deployments, including power consumption and lifetime limits and high communication overhead. Traditional techniques can efficiently address these issues while maintaining the semantic fidelity of the sensed medical signal, but they also amplify the signal's sensitivity to sensor faults, thereby reducing system safety. We propose a set of statistical techniques to optimize system power and bandwidth consumption while adhering to signal-fidelity and sensor-fault-diagnosis requirements. By defining signal fidelity in terms of its semantic value, and formulating the problem as sensor subset selection in which mutual information rather than aggregate signal quality is maximized, we show that power consumption in a wireless human gait monitoring system can be reduced by up to 78% while accurately estimating many functional gait assessment metrics and precisely diagnosing semantic faults.
Article
In this paper we focus on a Residential Community Network (RCN) in which the various home gateways have (limited) caching capabilities, can be shared among the users, and can exploit multi-hop wireless paths to communicate with each other. We analyse such an RCN from an energy-aware perspective, investigating its energy-saving potential in delivering contents to the end users. To this end, we define a resource allocation and routing scheme and, by means of an integer linear programming model, solve to optimality the problem of associating the user terminals to the home gateways and computing the data-flow paths. The resulting allocation pattern can thus serve as an upper bound on the global “greening” capability of the RCN. A comparison between our method and other techniques for resource sharing in RCNs sheds light on the pros and cons of each approach.
Article
The increasing demand for large data downloads on cellular networks is increasing congestion, which degrades end-user quality of service. This work addresses the problem of offloading the cellular network while distributing common content to a group of mobile devices that cooperate during the download by forming device-to-device communication networks. The base station unicasts different chunks of the content to selected mobile devices, which multicast them to each other over local ad hoc networks using multihop cooperation, while fairness constraints are maintained on the energy consumption of the mobile devices. The optimal cellular offloading problem is formulated as a mixed-integer linear program and its complexity is analyzed. A dynamic programming approach is then proposed to adapt the solution to network dynamics as the mobile devices move. Cellular offloading assuming single-hop cooperation among the mobile devices proves to be significantly less computationally complex than cooperation over a higher number of hops; however, both problems are NP-complete. Thus, polynomial-time greedy algorithms are presented to obtain computationally fast solutions with good performance. Performance results demonstrate that significant cellular offloading gains can be achieved, even if only a very small fraction of the mobile devices' battery levels is consumed for cooperation.
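A minimal greedy sketch of the fairness idea (assumed per-chunk costs and battery budgets, not the paper's MILP): always assign the next cellular download to the device with the largest remaining battery budget, which balances consumption across the cooperating devices.

```python
def greedy_offload(budgets, chunks, cost):
    """Assumed model: each of `chunks` content chunks is downloaded over
    cellular by one device at energy `cost`, then multicast locally
    (local reception treated as free here). Greedy fairness rule:
    always pick the device with the most battery budget left.
    Returns (chunk -> device assignment, remaining budgets)."""
    budgets = list(budgets)
    assignment = []
    for _ in range(chunks):
        d = max(range(len(budgets)), key=lambda i: budgets[i])
        if budgets[d] < cost:
            raise RuntimeError("battery budgets exhausted")
        budgets[d] -= cost
        assignment.append(d)
    return assignment, budgets

assign, left = greedy_offload([10.0, 6.0, 8.0], 6, 2.0)
print(assign, left)  # [0, 0, 2, 0, 1, 2] [4.0, 4.0, 4.0]
```

The rule drives all remaining budgets toward equality, which is the flavor of fairness constraint the abstract mentions; the actual formulation additionally optimizes which devices download at all and how the multihop relaying is routed.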
Conference Paper
In this paper we first present an overview of state-of-the-art remote patient monitoring systems against the backdrop of real clinical needs. The paper establishes clear guidelines for the clinical expectations of such a system from the viewpoint of practicing clinicians. It provides an in-depth analysis of the shortcomings of existing architectures and paves the way towards a practical “patient-centric” architecture that could be useful in day-to-day clinical practice for providing a “continuum of care”. Subsequently, the restrictions that the resource-constrained nature of such a system imposes on the development of hardware for processing the data acquired by body-worn sensors are analyzed.
Conference Paper
The set of services provided by mobile phone platforms is becoming increasingly complex and requires more computational power, which raises energy consumption and compromises the autonomy of these devices. It is therefore important to identify scenarios in which methods can be applied to decrease such consumption and extend battery autonomy. This work presents practical results from an experimental study evaluating four scenarios. For these scenarios, a control threshold was defined, and important variables were then modified so that we could analyze their impact on energy consumption. From this analysis we identified opportunities to develop new energy-saving methods, such as the examples listed in this paper.