A simplified continuous-time mathematical model of a K-winners-take-all (KWTA) neural circuit, described by a state equation with a discontinuous right-hand side and an output equation, is presented. A functional block diagram of the circuit is given, composed for N inputs of N feedforward and one feedback hard-limiting neurons used to determine the dynamic shift of the inputs. The circuit can process any finite-valued distinct signals at any specified minimal speed, controlled by a single parameter; it does not require resetting with a corresponding supervisory circuit, and it preserves signal ordering. Simulation results demonstrate good performance of the model.
Performance evaluation of a distributed system is always an intricate undertaking, since system behavior is normally distributed among several physically distributed components. With this in mind, we delineate a performance modeling framework for distributed systems that proposes a transformation process from high-level UML notation to an SPN model and solves the model for relevant performance metrics. To capture the system dynamics through our proposed framework, we outline a specification style that uses UML collaborations and activities as reusable specification building blocks, while deployment diagrams identify the physical components of the system and the assignment of software artifacts to those components. Optimal deployment mapping of the software artifacts onto the physically available system resources is investigated by deriving a cost function. The proposed performance modeling framework provides transformation rules from UML elements to the corresponding SPN representations, as well as prediction results such as system throughput. The applicability of the framework is demonstrated in the context of performance modeling of a distributed system.
Ciphertext-policy attribute-based encryption (CP-ABE) is an encryption technique in which data is encrypted according to an access policy over attributes. Users who hold a secret key associated with a set of attributes that satisfies the access policy can decrypt the encrypted data.
However, one of the drawbacks of CP-ABE is that it does not support updating access control policies without decrypting the encrypted data. We present a new variant of the CP-ABE scheme called ciphertext-policy attribute-based proxy re-encryption (CP-ABPRE). The proposed scheme allows the access control policy of the encrypted data to be updated without decrypting the ciphertext. The scheme uses a semi-trusted entity called a proxy to re-encrypt the encrypted data under a new access control policy, such that only users who satisfy the new policy can decrypt the data. The construction of our scheme is based on prime-order bilinear groups. We give a formal definition of semantic security and provide a security proof in the generic group model.
Based on the recognition of the context of physics teaching in current middle schools, this paper reports a web-based virtual lab (IWVL) for experimental physics teaching in junior high schools. The theoretical foundations of the system design, the system architecture, functional characteristics, experimental platforms, and four usage scenarios of IWVL are highlighted in the discussion.
The increasing demand for personal health monitoring products with long battery life has forced designers to use low-power circuits. An Operational Transconductance Amplifier (OTA) operating in the subthreshold (weak inversion) region offers a versatile solution for realizing low-power VLSI building blocks. This paper demonstrates a modified OTA with high linearity and improved performance, achieved by using a high-swing improved Wilson current mirror, for low-power and low-frequency applications. The achieved linear range is about ±1.9 V, with a unity-gain bandwidth (UGB) of 342.30 kHz. The OTA operates from a 0.9 V power supply and consumes power in the nanowatt range. The OTA simulation was performed in a standard TSMC 0.18 µm technology on the BSIM3v3 model using the ELDO simulator.
Keywords: Bulk input; Low supply voltage; Linear range; Subthreshold OTA; Improved Wilson current mirror
The workshop Capturing Ambient Assisted Living Needs brought together several scientists from the field and built upon the diverse methods used to capture the needs of users of ambient assisted living environments.
Keywords: Ambient assisted living; Methods; User centeredness
The future “Internet of Services” (IoS) will provide an open environment allowing market participants to offer and consume services over Internet marketplaces. It gives businesses the opportunity to outsource parts of their business processes, leading to networks of cooperating businesses with distributed execution of processes, and provides good support for inter-enterprise modeling. Many methods have been proposed to describe such processes, but most focus only on certain aspects and fall short in others. We present ePASS-IoS, a unified approach to describing processes and service choreographies with well-defined execution and verification semantics. By formulating the well-known workflow and interaction patterns in ePASS-IoS, we show that its expressiveness is adequate. To clearly define the semantics of the language, we formalize it using a process
Keywords: Business Process Modeling; Subject-Oriented Modeling; ePASS; Formal Semantics
In recent years, several international initiatives specifically oriented toward bringing together small and medium enterprises, processes, and agile methods have been identified. Likewise, various studies have mapped agile methodologies onto software development process models such as CMMI-DEV and ISO/IEC 12207, but the studies related to ISO/IEC 12207 are based on its 1995 version. This work therefore focuses on the relationship between agile practices, especially SCRUM, and a process subset of the 2008 version of the ISO/IEC 12207 standard. SCRUM is one of the most popular agile methods and is an incremental, iterative process: the project is divided into phases or iterations and delivered incrementally. The relationships indicated in this work are obtained from the analysis of previous works and from consulting experience at 25 enterprises that comply with the standard's outcomes while implementing agile methodologies. The main purpose of the study is to determine the extent to which agile practices help in implementing the practices indicated in this process model.
CMMI-ACQ and ISO/IEC 12207:2008 are process reference models that address best practices for software product acquisition. With the aim of offering information on how the practices described in these two models are related, and considering that mapping is one specific strategy for the harmonization of models, we have carried out a mapping of these two reference models for acquisition, taking into account the latest versions of the models. Furthermore, to carry out this mapping in a systematic way, we defined a process for this purpose. We consider that the mapping presented in this paper supports understanding and leveraging the properties of these reference models, which is the first step towards harmonization of improvement technologies. Furthermore, since a great number of organizations are currently acquiring products and services from suppliers and developing fewer and fewer of these products in-house, this work intends to support organizations interested in introducing or improving their practices for the acquisition of products and services using
Keywords: Harmonization of improvement technologies; Mapping; Software product acquisition; CMMI-ACQ; ISO/IEC 12207
The aim of this paper is to introduce the various design and runtime aspects of a state-of-the-art control architecture adopted for the development of an autonomous underwater vehicle (AUV-150) capable of operating at depths of up to 150 meters to perform sea-bed mapping and collect oceanographic data. The system control architecture is presented as an ensemble of hardware and software modules organized in a well-connected framework for effective operation. Various specifications, harnessing layout, and design issues are discussed in this paper.
Software companies that have been involved in a process improvement programme according to ISO/IEC 15504 have already performed some of the steps required to implement ISO/IEC 27000 as an information security management framework. After an in-depth analysis of the relations between ISO/IEC 15504-5 base practices and ISO/IEC 27002 security controls, this paper describes the security controls covered by the ISO/IEC 15504-5 processes, details the changes to these processes that would be necessary to implement the controls, and presents an ISO/IEC 15504 Security Extension that facilitates the implementation of both standards.
Keywords: ISO/IEC 15504 (SPICE); ISO/IEC 27000; Information security; Software Process Improvement (SPI)
The recent release of CMMI V1.3 incorporates a number of changes to the model and framework; one of the more interesting is the decision to do away with Capability Levels 4 and 5 in the Continuous Representation while retaining the high levels of Organizational Maturity. This paper examines some of the issues that may have driven this decision and explores the opportunity it provides for greater interaction between CMMI and ISO/IEC 15504.
Keywords: ISO/IEC 15504; SPICE; CMMI; Process capability; High maturity
The following paper deals with several aspects of positioning technology and application design aimed at providing a seamless system for a region consisting of many areas with varying characteristics. Beginning with an illustrative motivating example, basic demands are identified. Several classifications lead to specific requirements, followed by a distinction between the primary area types, indoor and outdoor. Finally, two approaches to common problems in this field are presented: one concerning data structuring and handling, the other concerning position determination across several areas.
Ambient intelligence is not limited to rooms and buildings. In the future, whole cities will become intelligent environments, with people networking with each other, dating, finding interesting places (e.g. restaurants, museums, meeting places), using public transportation, or dealing with traffic and parking problems. In such a city, millions of inhabitants interact with each other and benefit from information that other people or sensors provide. It feels just like a village where somebody always helps you find a restaurant, bar, or theatre, and where the citizens' choices, movements, and opinions influence urban planning and public intervention. Such a city and its applications can be realized by combining two major trends in mobile computing: Ambient Intelligence and Web 2.0. In this workshop we look for technologies, present and upcoming, that can make the Wiki-City real: technologies interconnecting people, places, events, opinions, and digital online content.
The primary goal of this paper is to study whether Web 2.0 tools such as blogs, wikis, social networks, and typical hypermedia, as well as techniques such as lip-reading, video sign language, and learning activities, are appropriate for learning purposes for deaf and hard-of-hearing people. In order to check the extent to which the choices mentioned above are compatible with the characteristics of this specific group and maximize the learning results, we designed an empirical study, which is presented below. The study was conducted in the context of SYNERGIA, a Leonardo da Vinci project of the Lifelong Learning Programme, in the MULTILATERAL PROJECTS TRANSFER OF INNOVATION section. The evaluation was conducted on data gathered through questionnaire analysis.
Through focus group interviews, this paper presents a case study conducted in a secondary school in Hong Kong on the use of Web 2.0 technologies among students, parents, and teachers. Findings suggest that there was no divide in terms of access and usage, but there was a divide in how Web 2.0 technologies were used among them. In conclusion, our research team speculated on the roles these stakeholders were playing and described them as naughty insiders, worried outsiders, and invisible monitors.
Keywords: Digital divide; Web 2.0; Technology integration
Successful professional realization requires not only specific technical competences of engineers but also suitable social behavior. In this paper the essential social competences for engineers are investigated, taking into consideration the competences proposed in current research published in scientific papers and the opinions gathered from students of the Technical University – Sofia. The findings lead to the creation of a social competences research model including the key social competences: communication, collaboration, networking, self-management, adaptability, knowledge of English, leadership, and loyalty. The characteristics of Web 2.0 as a platform that could support competence development are summarized, and a model for the social competences development of engineers in Web 2.0 settings is proposed.
Keywords: Web 2.0; Social competency; Social competences research model; Social competences development; Engineers
Although IT has been very successful in enabling distributed, collaborative learning and knowledge creation in open-source communities, its promise in other contexts is still an open question. In this paper, we describe the deployment of a video-based Web 2.0 platform in an executive education context. The platform, which we developed, makes extensive use of video, profiling, game dynamics, agents, and network visualizations in order to capture the attention and involvement of the learning community's members. Our goal was to provide executive education participants with an attractive, interactive platform for extending their learning and networking beyond the classroom. This experience has allowed us to identify three main barriers to Web 2.0 inter-organizational learning and collaboration in executive education: technological barriers, motivational barriers, and the inter-organizational
Keywords: Collaboration; Executive education; Inter-organizational; Knowledge management; Learning; Video; Web 2.0
One of the defining concepts of the contemporary Information and Communication Technologies business environment is the Web 2.0 phenomenon and related notions such as Social Computing, the Social Web, Social Software, Social Media, and User-Generated Media. However, whenever Web 2.0 is mentioned, it is usually surrounded by vague and ambiguous concepts and definitions, mostly a complex mixture of technical and business aspects. This situation is even more critical in the current context of convergence between wireless and web technologies, resulting in the so-called Mobile-Internet 2.0 phenomenon. This paper aims to shed light on this fuzzy environment by proposing a classification schema for Mobile-Internet 2.0 applications, using as the main categorizing criteria the type and characteristics of interaction permitted or facilitated by the applications.
Keywords: Web 2.0; Mobile 2.0; Mobile-internet convergence; Classification schema
A large number of methods, such as discriminant analysis, logit analysis, and recursive partitioning algorithms, have been used in the past for business failure prediction. Although some of these methods lead to models with a satisfactory ability to discriminate between healthy and bankrupt firms, they suffer from limitations, often because they only raise an alarm and cannot forecast. This is why we have undertaken research aimed at weakening these limitations. In this paper, we propose an Exponential Smoothing Forecasting and Pattern Recognition (ESFPR) approach and illustrate how it can be applied to business failure prediction modeling. The results are very encouraging and demonstrate the usefulness of the proposed method for bankruptcy prediction. The ESFPR approach discovers relevant subsets of financial characteristics and represents in these terms all important relationships between the image of a firm and its risk of failure.
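The forecasting building block named in the ESFPR approach can be illustrated with simple exponential smoothing. This is a minimal sketch only: the paper's actual model, smoothing constant, and pattern-recognition step are not given in the abstract, and the sample series of financial ratios below is hypothetical.

```python
def exponential_smoothing(series, alpha):
    """Return the smoothed series s, where
    s[t] = alpha * x[t] + (1 - alpha) * s[t-1], with s[0] = x[0]."""
    smoothed = [series[0]]
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

# Hypothetical financial ratio of a firm observed over five periods;
# the smoothed trend could feed a downstream failure classifier.
ratios = [1.2, 1.1, 0.9, 0.8, 0.6]
forecast = exponential_smoothing(ratios, alpha=0.5)
```

A larger `alpha` weights recent observations more heavily; a smaller one smooths out short-term fluctuations in the ratio.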
Tourists are a conventional kind of commuter in the urban transport system. During mega-events, such as Olympic Games or Expos, tourists become the most important and sizable part of the commuters in the host city. Expo 2010 Shanghai will attract 70 million tourists over its 184-day duration. The large number of tourists expected to be carried, combined with the congested urban road network and limited parking spaces, will make it difficult for individual transport to be used during the Expo; as such, high rates of public transport utilization will be necessary. Hence, exploring the trip mode choice behaviour of Expo tourists is the keystone of traffic planning for Expo 2010 Shanghai, especially the differences among heterogeneous tourists from various departure areas. A joint model system, combining clustering analysis and an HL model, is developed in this paper to investigate the differences in trip mode choice behaviour among heterogeneous tourist groups. Clustering analysis is used to distinguish the various types of Expo tourists, because the choice sets vary with the attributes of the groups. Sorted by departure area, tourists of Expo Shanghai can be divided into three kinds: local visitors, out-of-town one-day-trip visitors, and out-of-town lodging visitors. The statistical parameters of the three models constructed with this joint system show that this modeling method improves analytic
Keywords: Expo tourists; Choice behaviour; Clustering analysis
Rate control is an important part of video coding. This paper presents an improved frame-layer rate control algorithm that uses a combined frame complexity measure and an adjusted quantization parameter (QP). The combined frame complexity allows more reasonable bit allocation for each frame, and the quantization parameter, adjusted using information from already-encoded frames, yields more accurate rate control. Experimental results show that the proposed algorithm, in comparison with the original algorithm, reduces the actual bit rate error of video sequences and achieves a better average PSNR with a smaller deviation.
Keywords: H.264; Video coding; Rate control; Frame complexity; QP adjustment factor
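The idea of complexity-weighted frame-layer bit allocation can be sketched as below. This is an illustrative assumption, not the paper's algorithm: the abstract does not specify how the combined frame complexity is computed or how the QP adjustment factor is derived, so the proportional allocation rule and the sample numbers here are hypothetical.

```python
def allocate_bits(remaining_bits, complexities):
    """Split the remaining bit budget among upcoming frames in
    proportion to their (combined) complexity measures, so that
    more complex frames receive more bits."""
    total = sum(complexities)
    return [remaining_bits * c / total for c in complexities]

# Hypothetical example: 1000 bits left for three frames, the third
# twice as complex as the others.
budget = allocate_bits(1000, [1.0, 1.0, 2.0])
```

In a real encoder, the per-frame target produced this way would then be mapped to a QP via a rate-quantization model and refined as frames are encoded.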
In recent years, developments in wireless handheld devices and networking have provided the technical platform for multimedia streaming over mobile ad-hoc networks (MANETs). However, providing QoS for multimedia streaming is quite difficult in MANETs due to their physical and organizational characteristics. Because of the high data rate and frame size of multimedia traffic, there are times when the offered load exceeds the available network capacity, causing packet drops due to congestion and router input queue overflow. In this paper, we propose a cross-layer rate adaptation scheme that adapts the application's transmission rate to the network bandwidth. The proposed scheme uses the classical dual leaky bucket (DLB) algorithm to regulate the traffic flow with guaranteed QoS in terms of end-to-end delay. Furthermore, our scheme avoids congestion in the network by controlling the traffic flow of video sequences, which increases the overall network throughput. Our scheme uses the encoding information and QoS requirements of a data session (i.e. a video stream), provided by the application layer, to regulate the flow according to the available network resources. We validate our scheme in scenarios with different network sizes and node mobility degrees in order to show the benefits it offers. We have used video traces of the latest coding standard, H.264/SVC, to simulate video sources. The quality of the received video is measured in terms of network metrics such as end-to-end delay, jitter, and packet loss ratio.
Keywords: Ad hoc networks; Quality of Service; Video streaming; H.264/SVC; Dual leaky bucket; Cross-layer design; Video traces
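The classical dual leaky bucket regulation the abstract cites can be sketched with two token buckets, one policing the peak rate and one the sustained rate. This is a generic sketch of the technique, not the paper's implementation; the rates, bucket depths, and packet sizes below are hypothetical.

```python
class TokenBucket:
    def __init__(self, rate, depth):
        self.rate = rate      # tokens added per second
        self.depth = depth    # maximum tokens the bucket can hold
        self.tokens = depth   # start full
        self.last = 0.0       # time of the last refill

    def refill(self, now):
        # Accumulate tokens at the configured rate, capped at depth.
        self.tokens = min(self.depth,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now

class DualLeakyBucket:
    """A packet conforms only if both the peak-rate bucket and the
    sustained-rate bucket hold enough tokens; both are then debited."""

    def __init__(self, peak_rate, peak_depth,
                 sustained_rate, sustained_depth):
        self.peak = TokenBucket(peak_rate, peak_depth)
        self.sustained = TokenBucket(sustained_rate, sustained_depth)

    def conforms(self, size, now):
        for bucket in (self.peak, self.sustained):
            bucket.refill(now)
        if self.peak.tokens >= size and self.sustained.tokens >= size:
            self.peak.tokens -= size
            self.sustained.tokens -= size
            return True
        return False   # non-conforming: the sender delays or drops it

# Hypothetical parameters: bursts up to the peak rate are admitted,
# but the long-term average is held to the sustained rate.
regulator = DualLeakyBucket(peak_rate=10, peak_depth=5,
                            sustained_rate=2, sustained_depth=3)
```

A non-conforming verdict is what lets the application layer throttle the video source before congestion builds up in the network.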
In this paper we show that the second-round candidates of the NIST hash competition differ by up to 22% in their power consumption. We perform a detailed analysis of the candidates with respect to different performance parameters. Finally, we discuss time, power consumption, and energy per byte as criteria for distinguishing the candidates with respect to performance.
As mobile computing devices become pervasive, more and more people store precious information on mobile phones rather than desktop PCs, especially Global Logistics Management operators, who depend heavily on the Global Positioning System (GPS) in order to fulfill just-in-time delivery effectively and efficiently. In this paper, an embedded GPS smart phone was carried along roads in an attempt to disclose, via data mining technology, the digital evidence concerning the locations the current user had actually visited or wished to go. From a digital forensics point of view, digital evidence plays a critical and decisive role in some cybercrime or cyber terrorism cases, despite the diversity of mobile phones and their operating systems. The paper provides generic guidelines and methodologies for law enforcement agencies and digital forensics specialists to consider when they deal with similar cases.
Keywords: Digital forensics; Global Positioning System; Geotagging; Mobile computing device; Non-volatile memory; Smart phone