Augmenting Breath Regulation Using a Mobile
Driven Virtual Reality Therapy Framework
Ahmad Abushakra and Miad Faezipour, Member, IEEE
Abstract—This paper presents a conceptual framework of a virtual reality therapy (VRT) to assist individuals, especially lung cancer patients or those with breathing disorders, to regulate their breath through simulations and treatment, particularly for patients with cancer. The theories, methodologies and approaches, and real-world dynamics of the VRT, built as a conceptual framework using the smartphone, will be discussed. The architecture and technical aspects of the underlying platform of the virtual environment will also be presented.
Index Terms—Breathing movement classification, lung capacity
estimation, virtual therapy, visualization.
I. INTRODUCTION

LUNG cancer is the number one leading cause of cancer death in the United States, and the most commonly diagnosed cancers include those of the lung organs. For this reason, research on lung cancer prevention and treatment has received much attention across the medical fields as well as various systems/engineering disciplines.

One of the most challenging tasks for lung cancer patients is concerned with breathing, as the lungs are responsible for respiration. Lungs distribute oxygen to the entire body through blood flow via blood cells. Hence, any form of exercise that assists the lungs in providing oxygen to the rest of the body can, as a result, be beneficial. Studies have proven that inducing relaxation responses improves the efficiency of the immune system, while stress retards it.

Lung cancer patients and those with breathing disorders are also subject to serious attacks in which the patient's lung capacity is significantly reduced. This can even lead to severe shortness of breath and may require emergency treatment. Therefore, postoperative breathing exercises can decrease lung problems by encouraging the patient to take deep breaths.
The authors are with the University of Bridgeport, Bridgeport, CT 06604 USA (e-mail: aabushak@bridgeport.edu; firstname.lastname@example.org).
Digital Object Identifier 10.1109/JBHI.2013.2281195
Fig. 1. Smart-phone application of the proposed work.
In recent years, virtual reality therapy (VRT) has been explored by many researchers to a large extent.
Virtual Reality (VR), through simulation, has the potential of assisting in breath regulation, especially for those with breathing disorders. Virtual reality has been described as an “experience in which a person is surrounded by a 3-D computer-generated representation, and is able to move around in the virtual world, see it from different angles, reach into it, grab it and reshape it.” The interactive immersive features of a
virtual environment would provide a rich interactive and con-
textual setting to support experiential and active therapy. Another attractive advent is that smartphone devices are nowadays becoming increasingly popular in several aspects of our daily lives, and the usability, effectiveness, and acceptance of smartphone applications by patients has become an appealing matter. As virtual reality using smartphones has also evolved rapidly in the scale and scope of developing virtual reality applications for medical purposes, applying virtual reality therapy to assist lung cancer patients via smartphones appears to be a remarkable area under investigation, yet one that has not been fully explored.
In this paper, the conceptual framework of a smartphone-
based VRT is presented that monitors breathing movements and
assists users by encouraging them to regulate their breath and
increase the oxygen percentage in their blood via an interactive
platform. The overall framework is shown in Fig. 1. The paper
will first discuss the underlying theories and recent research
work on virtual reality environments for therapy purposes. The
The conceptual framework for the implementation of our VR platform will then be presented, followed by the
architecture and the technical aspects of the virtual environment
platform. The overall full-fledged implementation and test plan, however, is left as future work.
In this paper, we describe the conceptual framework of a
virtual reality therapy environment to aid individuals by monitoring their breathing movements and encouraging regulative breathing exercises through computer-aided visuals to increase
oxygen intake in their blood. The individual’s breathing sound
is used as an interfacing signal between the user/patient and the
smartphone-based VR framework. The framework analyzes the
patient’s breath in real-time and provides virtually real anima-
tions of the lungs as the inhalation and exhalation take place.
The lung capacity is computed simultaneously as the patient is
breathing and the patient is encouraged to take the next coming
breath deeply if the previous one was not sufficient. For lung
cancer patients, the framework animations also show cancerous
cells diminishing virtually as the patient’s breath is being regu-
lated. This, in turn, may have the potential to virtually boost the
immune system, decrease lung cancer symptoms and increase
the chance of survivability of patients with cancer.
D. Paper Organization
The rest of this paper is organized as follows. Section II
glances at earlier work related to virtual reality therapy frame-
works. In Section III, our proposed conceptual framework is
presented and the VRT framework components are elaborated.
The developmental framework of our virtual reality therapy
system is explained in Section IV. The conclusions and recom-
mendations for future work appear in Section V.
II. RELATED WORK
It is noteworthy that the connection between the brain and body is well recognized, with a powerful belief that visualizing something in the brain encourages the body to make it happen. This appealing fact motivated many researchers
to work along this path.
Virtual reality applications for cancer patients have been presented in earlier studies. These works present facts showing the effectiveness of using self-imagery to stimulate the immune system in
order to enhance its efforts to protect the body from the disease.
Moreover, while stress has been shown to retard the immune
system, relaxation has the opposite effect. This was the
motivation to develop a virtual reality visualization tool called
Staying Alive for cancer patients. The system offers a motiva-
tion for users to relax while instantly visualizing their immune
systems fighting against their diseases. White and red blood
cells and malignant cells populate the virtual environment. The
user navigates a white blood cell through the blood stream and
“digests” malignant cells found along the way. In this virtual
environment, the user, free of wires and other such encum-
berments, engages the system by sitting in a room, practicing
certain gestures to virtually fight against the disease. Computer
vision techniques are applied to monitor the user’s hands and
head in 3-D and a Hidden Markov Model framework is applied
to identify the motions in real time.
Virtual reality systems have also been presented in the literature in order to stimulate patients while they receive their treatment.
The authors stated that VR systems improve the patients’ state
of mind, thereby improving their immune system. Their study
involves studying the factors of VR in increasing pain tolerance
for chemotherapy. The authors also introduce the advantages
and difficulties of using VR as an added therapy to reduce pain.
Their study is based on the fact that diversion from unwanted
feelings such as pain and stress is one of the most effective
therapy techniques in dealing with such feelings/diseases.
Other researchers have presented the development and design of
a breathing interface and video game to promote compliance
with postoperative breathing exercises. An interactive spirome-
ter was introduced to motivate patients to perform postoperative
breathing exercises. The work, however, mostly described the
early progress of an interaction device, game design, initial play
testing, and the usability of the game.
In earlier work, a virtual reality therapy system was introduced for
the treatment of acrophobia and therapeutic cases with the ob-
jective of developing an affordable and more realistic virtual
environment to perform visual therapy for acrophobia. The vi-
sualization was a PC-based framework using a virtual scene of
a bungee-jump tower in the middle of a big city. The overall sys-
tem has proven that VR therapy environments are successful,
realistic, and at the same time very fascinating.
III. PROPOSED CONCEPTUAL FRAMEWORK
According to many research investigations, and based on reported figures, VRT has been shown to increase the chances of survivability of patients with cancer by 56%. This type of therapy
relieves the patient in a better and healthier virtual world, which
directly increases their level of hope, and empowers the im-
mune system. Similar to how stress can adversely affect any
pain or disease in general, and increase the growth of cancer
cells in particular, soothing and relaxing treatments, even in a fictitious world, have proven to be very effective for treatment and recovery. Moreover, breathing therapy is an established and serious supplementary treatment method for lung cancer. Thus, the presented VR approach has the potential to further optimize breathing therapy for patients with lung cancer and in general.
With this motivation, the development of a virtual environ-
ment is proposed here which is intended to aid any user in gen-
eral, and lung cancer patients in particular, by monitoring the
user’s breathing movements and motivating them to perform
interactive regulative breathing exercises and simultaneously
providing quantitative measurement of the progress and com-
pliance. The framework includes 3-D computer animations of
the human body, surfing through various tissue layers and cells,
and eventually landing on the lung organ and cells. For patients
with cancer, the patient is urged to diminish his/her own cancerous cells through the regulative breathing exercises that he/she performs in this virtual environment.
Fig. 2. Components of the proposed VRT framework.
This study introduces ongoing research on a conceptual framework. Capturing the breathing movements from the user through the acoustic
signal of respiration is one of the major inputs of the frame-
work. In addition to the acoustic breathing signal, the age, gen-
der, height, and the cancer stage of the patient (in case of lung
cancer patients) will be other input factors of the framework.
The framework will be a smartphone application (see Fig. 1)
that will interface a breathing movement classification compo-
nent, a lung capacity estimation component, and a visualization
component (see Fig. 2). All the components will be integrated
together to interact and to produce the overall framework output
which includes 3-D virtual computer animations of the human
lung cells moving according to the corresponding breathing
movements of the user/patient. The patient will be encouraged
to regulate his/her breath as he/she is observing the virtual lung
organ/cell movements in real time. In what follows, we describe
the components of our VRT framework.
A. Lung Capacity Estimation
One of the main components of our VRT framework is the
lung capacity estimation module. This component gives an es-
timate of the air-volume entering/exiting the lungs using the
acoustic signal of respiration.
The acoustic signal of respiration consists of inhalation and exhalation phases separated by a pause. In this part, we identify the breathing phases and extract certain metrics of each phase to model and estimate the lung capacity. Voice Activity Detection (VAD) is one of the most effective techniques for differentiating between silence and breathing activity. After capturing the acoustic signal of breath using a microphone and segmenting the breath cycles into inhale and exhale speech phases, as shown in Fig. 3, the average time duration and energy of the breathing cycle can be computed to estimate the lung capacity using the following metrics and factors.
1) Gender and adult state: The subject’s gender can affect
the computation result, since physiologically, female lung
sizes are generally different from male lungs.
2) Age: Subject’s age in years.
3) Height: Subject's height in inches.
4) Duration: The duration of time which the subject blows
air in the microphone.
Fig. 3. Segmenting acoustic signal of breath using VAD.
The following steps show the basics of the lung capacity estimation procedure.
1) First, the recorded acoustic breathing signal for each
speaker/subject is split into inhale and exhale speech seg-
ments. The splitting process has been implemented using
the VAD technique.
2) Second, the start and end point of each speech segment
(inhale and exhale duration) is marked.
3) Third, the time duration between the start and end of each speech segment (inhale and exhale) is computed.
4) Fourth, the energy of the signal within each speech segment is computed.
5) Finally, the lung capacity, also referred to as forced vital
capacity (FVC), is modeled using the following equations
(separate equations for male and female subjects) by sub-
stituting the five important factors: gender (m or f), age
(a), height (h), breathing time (t), and energy (e). These
equations were derived using empirical data and curve
fitting to estimate the lung capacity:
FVC_m = 100(0.1524h − 0.021a − 4.65)t    (1)
FVC_f = 100(0.1247h − 0.021a − 3.59)t.   (2)
The signal energy is also employed in the lung capacity calculations in addition to the breathing
time. Through analyzing the statistical model of lung capacity
estimation, it is clear that signal energy plays an important role
in lung capacity calculation, since it refers to the power of the
breath signal. Essentially, this means that the actual power and
depth of breath within a certain time of air blow forms the basis
of lung capacity estimation.
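To make the estimation procedure concrete, the following Python sketch walks through the steps above: a simple energy-threshold VAD stands in for the VAD technique, and the gender-specific constants come from Eqs. (1) and (2). Function names and the threshold value are illustrative; note that the energy factor e, which the text lists among the five inputs, does not appear in the printed equations and is therefore omitted here.

```python
import numpy as np

def simple_vad(signal, sr, frame_ms=30, energy_thresh=0.01):
    """Energy-threshold VAD: return (start, end) sample indices of active
    (breath) segments. A stand-in for the paper's VAD step."""
    frame = int(sr * frame_ms / 1000)
    segments, start = [], None
    for i in range(len(signal) // frame):
        active = np.mean(signal[i * frame:(i + 1) * frame] ** 2) > energy_thresh
        if active and start is None:
            start = i * frame                    # segment begins
        elif not active and start is not None:
            segments.append((start, i * frame))  # segment ends
            start = None
    if start is not None:
        segments.append((start, (len(signal) // frame) * frame))
    return segments

def estimate_fvc(gender, age, height_in, blow_time_s):
    """Estimate FVC from Eqs. (1)-(2), scaled by the air-blow duration."""
    if gender == "m":
        return 100 * (0.1524 * height_in - 0.021 * age - 4.65) * blow_time_s
    return 100 * (0.1247 * height_in - 0.021 * age - 3.59) * blow_time_s

# Example: the duration of the first detected segment drives the estimate.
# sr = 8000; segments = simple_vad(breath_signal, sr)
# start, end = segments[0]
# fvc = estimate_fvc("f", age=34, height_in=64, blow_time_s=(end - start) / sr)
```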
B. Breathing Movement Classification

Monitoring breath via a microphone to detect and classify breathing movements is the goal of this component. Mel-Frequency Cepstral Coefficients
(MFCCs) are coefficients that represent audio signal charac-
teristics according to the human ear perception. It is based
on the known variation of the human ear’s critical bandwidths
with frequency. MFCCs are extracted through a series of steps that mainly include filtering, windowing, Fast Fourier Transform (FFT), and Discrete Cosine Transform (DCT) computations. We employ MFCCs along with speech segmentation techniques using VAD and linear thresholding on the acoustic signal of breath captured using a microphone to depict the differences between inhale and exhale in the frequency domain. The MFCC values are computed and plotted, and the inhale and exhale phases are differentiated using the sixth MFCC order, which carries important classification information. Experimental results on a number of individuals verify this classification methodology.

Fig. 4. Linear threshold of the sixth MFCC value for inhale and exhale.
The following steps show the classification procedure.
1) The recorded acoustic breathing signal for each speaker/subject is split into speech and silence segments.
The splitting process was implemented using the VAD
technique, as described earlier.
2) The 13 MFCCs for each speech/voiced segment of the same speaker are calculated.
3) The sixth MFCC for the speech breathing segments of
each speaker are further analyzed using the linear thresh-
olding method of averages, where the inhale and exhale
utterances are differentiated from being above/below the
threshold (see Fig. 4). These differentiated MFCCs are then related back to the speech segments to mark which segment corresponds to inhale and which to exhale.
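A minimal sketch of this classification procedure is given below, assuming librosa for MFCC extraction in place of the paper's own filtering/windowing/FFT/DCT pipeline; the segment list would come from the VAD step, and which phase falls above the threshold is an assumption.

```python
import numpy as np
import librosa  # used here for MFCC extraction; the paper implements its
                # own filtering/windowing/FFT/DCT pipeline instead

def classify_breath_segments(signal, sr, segments):
    """Label each VAD segment as inhale or exhale by thresholding the
    mean sixth MFCC against the average over all segments."""
    sixth = []
    for start, end in segments:
        mfcc = librosa.feature.mfcc(y=signal[start:end], sr=sr, n_mfcc=13)
        sixth.append(float(np.mean(mfcc[5])))  # sixth coefficient (row 5)
    threshold = float(np.mean(sixth))          # linear threshold of averages
    # Assumption: inhale lies above the threshold; the text only states
    # that the two phases fall on opposite sides of it.
    return ["inhale" if v >= threshold else "exhale" for v in sixth]
```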
C. Visualization Component

In this framework, the visualization component displays a
virtually-real animation of the lungs in real time that encour-
ages the patient to regulate his/her breath (see Figs. 1 and 5).
The patient interacts with 3-D animations that virtually resem-
ble real lung cells. Breath movements such as inhale
and exhale are to be identified and the user is encouraged to reg-
ulate his/her breath through the framework. These movements
may also virtually navigate the patient’s cancer cells to certain
directions. As the patient is manipulating his/her own cancerous cells in a relaxing virtual environment, he/she is actually practicing an effective treatment method, that is, breathing treatments to increase oxygen intake.

Fig. 5. High-quality animations of real lung cells.
The modeling and visualization of human lungs in this study
will be performed using high-definition 3-D animations. In gen-
eral, many different methods and techniques exist for generat-
ing high-definition 3-D animations of the human lung. These
methods start with the knowledge of lung anatomy, followed by segmentation and 3-D modeling. In related work, a segmentation of lung anatomy and a 3-D atlas using HRCT volume data is presented for modeling and visualizing human lungs. Modeling real-time deformations of a 3-D high-resolution polygonal model for the visualization component ensures high-quality animations, which allow the patient to experience the VRT as if it were real.
IV. DEVELOPMENTAL VIRTUAL THERAPY FRAMEWORK
The development of our highly feasible and portable virtual
reality therapy framework on a smartphone device customized
to monitor breathing movements and assist in breath regulation
starts with the Application Programming Interface specification
intended to be used as an interface for the framework com-
ponents to communicate with one another (see Fig. 6). The
Data Source contains the patient information including the age,
gender, height, and the cancer stage which passes through the
Data Analysis block to the Mobile Element where the acoustic
breathing signal of the user/patient is captured.
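As an illustration of the Data Source contents, a minimal record might look like the following sketch; the field names are hypothetical and not the framework's actual API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PatientInfo:
    """Record held by the Data Source block; field names are illustrative,
    not the framework's actual API."""
    age: int                            # years
    gender: str                         # "m" or "f", as in Eqs. (1)-(2)
    height_in: float                    # inches
    cancer_stage: Optional[int] = None  # None for users without lung cancer
```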
The user/patient would initially enter a virtual environment
by activating an application on a smartphone that is to be devel-
oped by integrating the system components: breathing move-
ment classification component, lung capacity estimation com-
ponent, and the visualization component (Fig. 2). Goggles may
be used to enhance 3-D visibility and also to detach the patient
from surrounding distractions. As a result, the user will then
observe high-quality animations of the human body through
the goggle. The Operations Management block manages the
system operations starting from generating an avatar of the pa-
tient around the chest area, gradually penetrating through skin,
muscles, and finally approaching the lungs. The framework
zooms on the lung organs and also visualizes the tumor forming
Fig. 6. Developmental framework of the virtual reality therapy system.
Fig. 7. State diagram of the breathing exercises.
the cancer virtually for patients with cancer.
Various breath movements are to be identified/classified in
this framework and each movement is to be reflected differently
on the animation as the lung capacity is calculated in the Exe-
cution Environment. The Generate Environment block uses the
Environment Management block to integrate the input and guar-
antee the runtime for the application. If any interruption happens to the system, it is handled by the Function Filter block. The overall functionality will be performed by permitting patients to see a virtually real image of their lungs while they are breathing, so that when their inhale/exhale is less than normal, they will be motivated to take their next breath more efficiently. This, in turn, will lead to increasing the oxygen percentage in their blood.
The real-time functionality of this framework is critical to
synchronize the breathing movement classification, lung capac-
ity estimation and the animations the patient will see on the
screen. This is done by splitting the captured acoustic signal
into small chunks of data and storing it in the memory units of
the device (smartphone) and then processing only small chunks
at a time with a small delay (less than a portion of a second),
which makes it feel as if it were functioning in real time for the user/patient.
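The chunked processing described above can be sketched as a simple loop; `capture_chunk`, `process`, and `render` are hypothetical placeholders for the microphone read, the classification/estimation pipeline, and the animation update.

```python
from collections import deque

CHUNK_SECONDS = 0.1  # "less than a portion of a second" per chunk

def realtime_loop(capture_chunk, process, render, history=50):
    """Capture -> process -> render in small chunks so the pipeline
    feels real-time to the user."""
    buffer = deque(maxlen=history)        # small in-memory chunk history
    while True:
        chunk = capture_chunk(CHUNK_SECONDS)
        if chunk is None:                 # session ended
            break
        buffer.append(chunk)              # store chunk in device memory
        render(process(chunk))            # classify/estimate, update screen
```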
The state diagram in Fig. 7 shows the flow of the events in
the framework. The flow of events in each breathing exercise
session is described as follows.
1) (S0) The Start state is the initial state at the beginning
of the breathing exercise session. The initial lung volume
(LV) is initialized to zero (LV = 0) in this state.
2) (S1) In the Capture state, the acoustic signal of respiration
of the patient is being captured via a microphone (smart-
phone embedded microphone).
3) (S2) The Classification state is the state where the breath-
ing movements are classified into inhale or exhale.
4) (S3) Then, in the Lung Capacity state, the amount of air
that is inhaled or exhaled to/from the lungs is computed.
This value is denoted as LVnew.
5) (S4) Next, the total amount of oxygen represented in the lung is computed by summing up the lung volume values from the beginning of the breathing exercise session. If the total volume has not yet reached the target limit (which can be predefined in each breathing exercise session), the system will return to state (S1), the Capture state; otherwise, it will go to the End state.
6) (S5) The End state is where the breathing exercise session ends.
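The session flow of Fig. 7 can be sketched as a small loop over the states; the callables are placeholders for the framework components, and accumulating every cycle's volume follows the S4 description above.

```python
def breathing_session(capture, classify, estimate_volume, target_volume):
    """One exercise session following the S0-S5 flow of Fig. 7."""
    lung_volume = 0.0                        # (S0) Start: LV = 0
    while lung_volume < target_volume:       # (S4) compare against target
        signal = capture()                   # (S1) Capture respiration audio
        movement = classify(signal)          # (S2) inhale or exhale; would
                                             #      drive the animation
        lv_new = estimate_volume(signal)     # (S3) air volume via (1)/(2)
        lung_volume += lv_new                # (S4) LV = LV + LV_new
    return lung_volume                       # (S5) End of the session
```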
Fig. 8. Screenshot of the lung capacity parameters for each breathing exercise cycle in the VRT app.

Fig. 9. Screenshot of the breathing exercise session in the VRT app. (a) User information. (b) Lungs before starting the breathing exercise session. (c) Lungs while performing the breathing exercise session. (d) Lungs in maximum size at the end of the session.
Figs. 8 and 9 show screenshots of an early-phase development of our VRT smartphone app. These figures illustrate the states of the VRT for one breathing exercise session, providing quantitative measurement of the progress.
Fig. 9(a) shows the user information (age, gender, height, and cancer stage) collected from the user/patient.
Fig. 9(b) shows the screenshot of the lungs in the app right
before the patient starts breathing in the breathing exercise ses-
sion. In this state, the air volume of the lung is in its smallest
size (LV = 0).
Fig. 8 shows the list of parameters calculated during the patient's breathing exercises, obtained by capturing the acoustic signal of the patient's breath through the smartphone microphone. These parameters are as follows.
1) Current lung size: the air volume of the current breathing
cycle (S3) computed from (1) or (2).
2) Target lung capacity: the lung capacity value that the patient would like to achieve in the breathing session. It is normally set to a predefined default value based on the user's goal or the patient's cancer severity stage.
3) Total lung capacity: the accumulated value of the lung volume computed in each breathing cycle of the session, displayed as comma-separated values.
Fig. 9(c) shows the screenshot of the lungs in the app in the
middle of the breathing exercise session. In this intermediate
stage, it is observed that the lungs are inflated as compared to
the initial stage, as the air volume has increased. The previous
values and parameters are also shown in this Figure.
Fig. 9(d) shows the screenshot of the lungs in the app at its
maximum size at the end of the breathing exercise session. The
patient continues to perform the breathing exercise until he/she
has gained the intended lung capacity during the session.
Promising results (>85% accuracy) on a number of subjects
(20–125 subjects) motivated us to deploy the microphone em-
bedded within a smartphone for lung capacity estimation and
breathing movement classification.
These results form the basis of the intended virtual therapy platform. The various components of the framework are integrated together efficiently for high-performance real-time functionality.
V. CONCLUSION AND FUTURE DIRECTIONS
Lung cancer patients and those with breathing disorders (e.g., patients with asthma) usually go through severe breathing conditions. However, a customized virtual reality therapy environ-
ment which allows users/patients to virtually visit their lung
organs/cells and encourages them to regulate their breath, ap-
pears to be a promising approach for any general user as well as
patients to empower their immune system and eventually fight
off the disease.
This paper introduced a conceptual framework of a
smartphone-based virtual reality environment that deals with
analyzing acoustic breathing signals in real time to monitor
respiration movements. This work identifies the components that should be in place for such a framework to be fully implemented and functional. In continu-
ation of this research, we intend to fully implement and test
the virtual reality platform to assist users regulate their breath
by integrating the proposed framework with a high-quality ani-
mated application on a smartphone. As breathing therapy is an
established supplementary treatment method for lung cancer,
the presented VRT has the potential to further optimize/regulate breathing for patients with lung cancer and in general by providing a daily estimate of their lung size using a hand-held device at home.
In the future, when fully functional, the proposed breathing regulation framework could be evaluated on lung cancer patients or those with breathing disorders over a long period of time, which is clearly beyond the scope of this paper.

REFERENCES
Cancer Facts and Figures. (2013). [Online]. Available:
 B. S. Siegel, Love, Medicine & Miracles: Lessons Learned About Self-
Healing From a Surgeon’s Experience With Exceptional Patients. New York, NY, USA: HarperCollins, 1990.
 D. A. Becker and A. Pentland, “Using a virtual environment to teach
cancer patients t’ai chi, relaxation and self-imagery,” Tech. Rep. no. 390,
Massachusetts Inst. Technol., 1997.
 S. E. R. Sims, “Relaxation training as a technique for helping patients
cope with the experience of cancer: A selective review of the literature,”
J. Adv. Nursing, vol. 12, no. 5, pp. 583–591, 1987.
 D. A. Becker and A. Pentland, “Staying alive: A virtual reality visual-
ization tool for cancer patients,” in Proc. AAAI Workshop Entertainment
Alife/AI, 1996, pp. 17–21.
 J. Wong, E. Lyn, E. Wilson, G. Lowe, M. Sharpe, J. Robertson,
A. Martinez, and E. Aird, “The application of breathing control for treat-
ment of lung cancer with CHARTWEL,” in Proc. 22nd Annu. IEEE Int. Conf.
Eng. Med. Biol., 2000, vol. 4, pp. 2741–2743.
 W. Sa, M. Zhengli, Z. Changhai, G. Huili, L. Shanshan, and C. Beibei, “A
new method of virtual reality based on unity3D,” in Proc. 18th IEEE Int.
Conf. Geoinformat., 2010, pp. 1–5.
 E. A. Suma, D. M. Krum, and M. Bolas, “Sharing space in mixed and
virtual reality environments using a low-cost depth sensor,” in Proc. IEEE
Int. Symp. Virtual Reality Innovation, 2011, pp. 349–350.
obesity and diabetes: Industry perspective,” J. Diabetes Sci. Technol.,
vol. 5, no. 2, pp. 277–282, 2011.
 S. M. Nordin, S. Sulaiman, D. Rambli, W. Ahmad, and A. Mahmood,
“A conceptual framework for teaching technical writing using 3D virtual
C. Srisawatsakul, “Measuring Thai consumers’ acceptance of free-
application advertisement in Android and iOS devices: A conceptual
model,” in Proc. 1st Int. Conf. Mobility for Life: Technol. Telecommun.
Problem Based Learning, 2012, pp. 1–7.
 L. Huakun, S. Hongzhi, F. Yi, C. Xu, and Z. Zichao, “A remote usability
testing platform for mobile phones,” in Proc. IEEE Int. Conf. Comput.
Sci. Autom. Eng., 2011, pp. 312–316.
 J. L. Olson, D. M. Krum, E. A. Suma, and M. Bolas, “A design for a
smartphone-based head mounted display,” in Proc. IEEE Virtual Reality
Conf., 2011, pp. 233–234.
 H. Aboalsamh, H. Al Hashim, F. Alrashed, and N. Alkhamis, “Virtual
11th Int. Conf. Bioinformat. Bioeng., 2011, pp. 143–147.
 B. Lange, S. Flynn, A. Rizzo, M. Bolas, M. Silverman, and A. Huerta,
“Breath: A game to motivate the compliance of postoperative breathing
exercises,” in Proc. Virtual Rehabil. Int. Conf., 2009, pp. 94–97.
 D. P. Jang, J. H. Ku, Y. H. Choi, B. K. Wiederhold, S. W. Nam, I. Y. Kim,
and S. I. Kim, “The development of virtual reality therapy (VRT) system
for the treatment of acrophobia and therapeutic case,” IEEE Trans. Inf.
Technol. Biomed., vol. 6, no. 3, pp. 213–217, Sep. 2002.
Technol. Inform., vol. 44, pp. 87–94, 1997.
 N. C. Durham, “Virtual reality helps breast cancer patients cope with
 P. Hult, B. Wranne, and P. Ask, “A bioacoustic method for timing of
the different phases of the breathing cycle and monitoring of breathing
frequency,” Med. Eng. Phys., vol. 22, pp. 425–433, 2000.
 J. Kroutil and M. Husaik, “Detection of breathing,” in Proc. 7th IEEE Int.
Conf. Adv. Semicond. Devices Microsyst., 2008, pp. 167–170.
and Applications, 2nd ed. New York, NY, USA: Wiley, 2006.
 A. Abushakra and M. Faezipour, “An automated approach towards esti-
mating lung capacity from respiration sounds,” in Proc. 1st IEEE Health-
care Technol. Conf.: Translational Eng. Health Med., 2012, pp. 232–235.
 A. Abushakra and M. Faezipour, “Lung capacity estimation through
acoustic signal of breath,” in Proc. 12th IEEE Int. Conf. Bioinformat.
BioEng., 2012, pp. 386–391.
 A. Abushakra, M. Faezipour, and A. Abumunshar, “Efficient frequency-
based classification of respiratory movements,” in Proc. IEEE Int. Conf.
Electro/Inf. Technol., 2012, pp. 1–5.
A. Abushakra and M. Faezipour, “Acoustic signal classification of breathing movements to virtually aid breath regulation,” IEEE J. Biomed. Health Inf., vol. 17, no. 2, pp. 493–500, Mar. 2013.
 F. G. Hamza-Lup, A. P. Santhanam, C. Imielinska, S. L. Meeks, and
J. P. Rolland, “Distributed augmented reality with 3-d lung dynamics—A
planning tool concept,” IEEE Trans. Inf. Technol. Biomed., vol. 11, no. 1,
pp. 40–46, Jan. 2007.
 Nucleus: Medical Video, Animation & Illustration, (2013). [Online].
 Breathing Treatments for Lung Cancer, (2013). [Online]. Available:
 M. S. Brown, J. G. Golding, R. D. Suh, M. F. McNitt-Gray, J. W. Sayre,
and D. R. Aberle, “Lung micronodules: Automated method for detection
lung,” in Proc. 2nd Int. Symp. 3D Data Process. Vis. Transmiss., 2004,
 A. P. Santhanam, C. Imielinska, P. Davenport, P. Kupelian, and
J. P. Rolland, “Modeling real-time 3-D lung deformations for medical
visualization,” IEEE Trans. Inf. Technol. Biomed., vol. 12, no. 2, pp. 257–
270, Mar. 2008.
Ahmad Abushakra received the B.S. degree in in-
formation technology from Philadelphia University,
in 2004, the M.S. degree in management information systems from the Arab Academy for Banking and Financial Sciences, Amman, Jordan, in 2007, and the
Ph.D. degree in computer science and engineering
from the University of Bridgeport, Bridgeport, CT,
USA, in 2013.
He was with The Arab Education Forum as Web
Developer until August 2008. In September 2008, he
became an Information Technology Manager for The
Arab Education Forum. His research interests include smartphone content information, management systems, virtual reality systems development, mobile web application systems, and web technologies.
Miad Faezipour received the B.Sc. degree in electrical engineering from the University of
Tehran, Tehran, Iran, and the M.Sc. and Ph.D. de-
grees in electrical engineering from the University of
Texas at Dallas, Richardson, TX, USA.
She is currently with the Departments of Computer Science and Engineering and Biomedical Engineering at the
University of Bridgeport (UB), Bridgeport, CT, USA
and the Director of the Digital/Biomedical Embed-
ded Systems and Technology (D-BEST) Lab since
July 2011. Prior to joining UB, she has been a Post-
Doctoral Research Associate at the University of Texas at Dallas. Her research interests include biomedical signal processing and behavior analysis techniques, high-speed packet processing architectures, and digital/embedded systems.
Dr. Faezipour is a member of the IEEE EMBS and IEEE Women in Engineering.