
Advances and Innovations in Systems, Computing Sciences and Software Engineering



Advances and Innovations in Systems, Computing Sciences and Software Engineering is a collection of world-class articles addressing the following topics. Image and Pattern Recognition: Compression, Image Processing, Signal Processing Architectures, Signal Processing for Communication, Signal Processing Implementation, Speech Compression, and Video Coding Architectures. Languages and Systems: Algorithms, Databases, Embedded Systems and Applications, File Systems and I/O, Geographical Information Systems, Kernel and OS Structures, Knowledge Based Systems, Modeling and Simulation, Object Based Software Engineering, Programming Languages, and Programming Models and Tools. Parallel Processing: Distributed Scheduling, Multiprocessing, Real-time Systems, Simulation Modeling and Development, and Web Applications. New Trends in Computing: Computers for People with Special Needs, Fuzzy Inference, Human Computer Interaction, Incremental Learning, Internet-based Computing Models, Machine Intelligence, Natural Language Processing, Neural Networks, and Online Decision Support Systems.
Advances and Innovations in Systems, Computing Sciences and Software Engineering

Edited by

Khaled Elleithy
University of Bridgeport
A C.I.P. Catalogue record for this book is available from the Library of Congress.
ISBN 978-1-4020-6263-6 (HB)
Published by Springer,
P.O. Box 17, 3300 AA Dordrecht, The Netherlands.
Printed on acid-free paper
All Rights Reserved
© 2007 Springer
No part of this work may be reproduced, stored in a retrieval system, or transmitted
in any form or by any means, electronic, mechanical, photocopying, microfilming, recording
or otherwise, without written permission from the Publisher, with the exception
of any material supplied specifically for the purpose of being entered
and executed on a computer system, for exclusive use by the purchaser of the work.
ISBN 978-1-4020-6264-3
To my wife and sons.
Table of Contents
1. An Adaptive and Extensible Web-Based Interface System for Interactive
Video Contents Browsing
Adrien Joly and Dian Tjondronegoro
2. Design and Implementation of Virtual Instruments for Monitoring and Controlling
Physical Variables Using Different Communication Protocols
A. Montoya, D. Aristizábal, R. Restrepo, N. Montoya and L. Giraldo
3. Online Decision Support System for Dairy Farm
A. Savilionis, A. Zajančkauskas, V. Petrauskas and S. Juknevičius
4. Decision Making Strategies in Global Exchange and Capital Markets
Aleksandras Vytautas Rutkauskas and Viktorija Stasytyte
5. A Simple and Efficient Solution for Room Synchronization Problem
in Distributed Computing
Alex A. Aravind
6. Improving Computer Access for Blind Users
Amina Bouraoui and Mejdi Soufi
7. Developing a Multi-Agent System for Dynamic Scheduling Through AOSE Perspective
Ana Madureira, Joaquim Santos, Nuno Gomes and Ilda Ferreira
8. Criminal Sentencing, Intuition and Decision Support
Andrew Vincent, Tania Sourdin and John Zeleznikow
9. An Approach for Invariant Clustering and Recognition in Dynamic Environment
Andrey Gavrilov and Sungyoung Lee
10. Modelling non Measurable Processes by Neural Networks: Forecasting Underground Flow
Case Study of the Cèze Basin (Gard - France)
A. Johannet, P.A. Ayral and B. Vayssade
11. Significance of Pupil Diameter Measurements for the Assessment of Affective State
in Computer Users
Armando Barreto, Jing Zhai, Naphtali Rishe and Ying Gao
12. A Novel Probing Technique for Mode Estimation in Video Coding Architectures
Ashoka Jayawardena
13. The Effects of Vector Transform on Speech Compression
B.D. Barkana & M.A. Cay
14. Software Development Using an Agile Approach for Satellite Camera Ground
Support Equipment
D. dos Santos, Jr., I. N. da Silva, R. Modugno, H. Pazelli and A. Castellar
15. Priming the Pump: Load Balancing Iterative Algorithms
David J. Powers
16. An Ontology for Modelling Flexible Business Processes
Denis Berthier
17. Routing Free Messages Between Processing Elements in a Hypercube
with Faulty Links
Dinko Gichev
18. OPTGAME: An Algorithm Approximating Solutions for Multi-Player Difference Games
Doris A. Behrens and Reinhard Neck
19. Rapid Development of Web Applications with Web Components
Dzenan Ridjanovic
20. Mesh-adaptive Methods for Viscous Flow Problem with Rotation
E. Gorshkova, P. Neittaanmaki and S. Repin
21. Metamodel-based Comparison of Data Models
Erki Eessaar
22. BEMGA: A HLA Based Simulation Modeling and Development Tool
Ersin Ünsal, Fatih Erdoğan Sevilgen
23. Comparison of Different POS Tagging Techniques (n-gram, HMM and Brill’s tagger)
for Bangla
Fahim Muhammad Hasan
24. Real-Time Simulation and Data Fusion of Navigation Sensors for Autonomous
Aerial Vehicles
Francesco Esposito, Domenico Accardo, and Antonio Moccia
25. Swarm-based Distributed Job Scheduling in Next-Generation Grids
Francesco Palmieri and Diego Castagna
26. Facial Recognition with Singular Value Decomposition
Guoliang Zeng
27. The Application of Mobile Agents to Grid Monitor Services
Guoqing Dong and Weiqin Tong
28. Expanding the Training Data Space Using Bayesian Test
Hamad Alhammady
29. A Multi-Agent Framework for Building an Automatic Operational Profile
Hany EL Yamany and Miriam A.M. Capretz
30. An Efficient Interestingness Based Algorithm for Mining Association Rules
in Medical Databases
Siri Krishan Wasan, Vasudha Bhatnagar and Harleen Kaur
31. NeSReC: A News Meta-Search Engines Result Clustering Tool
Hassan Sayyadi, Sara Salehi and Hassan AbolHassani
32. Automatic Dissemination of Text Information Using the EBOTS System
Hemant Joshi and Coskun Bayrak
33. Mapping State Diagram to Petri Net: An Approach to Use Markov Theory
for Analyzing Non-functional Parameters
H. Motameni, A. Movaghar, M. Siasifar, M. Zandakbari and H. Montazeri
34. A Distributed Planning & Control Management Information System for Multi-site
Ioannis T. Christou and Spyridon Potamianos
35. Supporting Impact Analysis by Program Dependence Graph Based Forward Slicing
Jaakko Korpi and Jussi Koskinen
36. An Analysis of Several Proposals for Reversible Latches
J. E. Rice
37. Implementation of a Spatial Data Structure on a FPGA
J. E. Rice, W. Osborn and J. Schultz
38. Security Management: Targets, Essentials and Implementations
Zhao Jing and Zheng Jianwu
39. Application of Fuzzy Set Ordination and Classification to the Study of Plant
Communities in Pangquangou Nature Reserve, China
Jin-tun Zhang and Dongpin Meng
40. On Searchability and LR-Visibility of Polygons
John Z. Zhang
41. Swarm Intelligence in Cube Selection and Allocation for Multi-Node OLAP Systems
Jorge Loureiro and Orlando Belo
42. Developing Peer-to-Peer Applications with MDA and JXTA
José Geraldo de Sousa Junior and Denivaldo Lopes
43. A Case Study to Evaluate Templates & Metadata for Developing Application Families
José Lamas Ríos and Fernando Machado-Píriz
44. Application of Multi-Criteria to Perform an Organizational Measurement Process
Josyleuda Melo Moreira de Oliveira, Karlson B. de Oliveira, Ana Karoline A. de Castro,
Plácido R. Pinheiro and Arnaldo D. Belchior
45. Institutionalization of an Organizational Measurement Process
Josyleuda Melo Moreira de Oliveira, Karlson B. de Oliveira, Arnaldo D. Belchior
46. Decomposition of Head Related Impulse Responses by Selection of Conjugate Pole Pairs
Kenneth John Faller II, Armando Barreto, Navarun Gupta and Naphtali Rishe
47. GIS Customization for Integrated Management of Spatially Related Diachronic Data
K. D. Papadimitriou and T. Roustanis
48. BlogDisc: A System for Automatic Discovery and Accumulation of Persian Blogs
Kyumars Sheykh Esmaili, Hassan Abolhassani and Zeinab Abbassi
49. Fuzzy Semantic Similarity Between Ontological Concepts
Ling Song, Jun Ma, Hui Liu, Li Lian and Dongmei Zhang
50. Research on Distributed Cache Mechanism in Decision Support System
LIU Hui and JI Xiu-hua
51. Research on Grid-based and Problem-oriented Open Decision Support System
Liu Xia, Xueguang Chen, Zhiwu Wang and Qiaoyun Ma
52. Development and Analysis of Defect Tolerant Bipartite Mapping Techniques for
Programmable Cross-points in Nanofabric Architecture
Mandar Vijay Joshi and Waleed Al-Assadi
53. Nash Equilibrium Approach to Dynamic Power Control in DS-CDMA Systems
J. Qasimi M and M. Tahernezhadi
54. Natural Language Processing of Mathematical Texts in mArachna
Marie Blanke, Sabina Jeschke, Nicole Natho, Ruedi Seiler and Marc Wilke
55. Humanization of E-Services: Human Interaction Metaphor in Design of E-Services
Mart Murdvee
56. Introducing the (POSSDI) Process
Mohammad A. ALGhalayini and Abad Shah
57. Infrastructure for Bangla Information Retrieval in the Context of ICT for Development
Nafid Haque, M. Hammad Ali, Matin Saad Abdullah and Mumit Khan
58. An Improved Watermarking Extraction Algorithm
Ning Chen and Jie Zhu
59. Building Knowledge Components to Enhance Frequently Asked Question
Noreen Izza Arshad, Savita K. Sugathan, Mohamed Imran M. Ariff and Siti Salwa A. Aziz
60. Semantic Representation of User’s Mental Trust Model
Omer Mahmood and John D Haynes
61. Accessing Concurrent Sessions Based on Quorums
Ousmane THIARE, Mohamed NAIMI and Mourad GUEROUI
62. A Dynamic Fuzzy Model for Processing Lung Sounds
P.A. Mastorocostas, D.N. Varsamis, C.A. Mastorocostas and C.S. Hilas
63. A Formal Specification in JML of Java Security Package
Poonam Agarwal, Carlos E. Rubio-Medrano, Yoonsik Cheon and Patricia. J Teller
64. Enterprise Integration Strategy of Interoperability
Raymond, Cheng-Yi Wu and Jie Lu
65. A Method for Consistent Modeling of Zachman Framework Cells
S. Shervin Ostadzadeh, Fereidoon Shams Aliee and S. Arash Ostadzadeh
66. Beyond User Ranking: Expanding the Definition of Reputation in Grid Computing
Said Elnaffar
67. A Comparative Study for Email Classification
Seongwook Youn and Dennis McLeod
68. Noise Reduction for VoIP Speech Codecs Using Modified Wiener Filter
Seung Ho Han, Sangbae Jeong, Heesik Yang, Jinsul Kim, Won Ryu, and Minsoo Hahn
69. A Formal Framework for “Living” Cooperative Information Systems
Shiping Yang and Martin Wirsing
70. Crime Data Mining
Shyam Varan Nath
71. Combinatorial Hill Climbing Using Micro-Genetic Algorithms
Spyros A. Kazarlis
72. Alternate Paradigm for Navigating the WWW Through Zoomable User Interface
Sumbul Khawaja, Asadullah Shah and Kamran Khowaja
73. A Verifiable Multi-Authority E-Voting Scheme for Real World Environment
T. Taghavi, M. Kahani and A. G. Bafghi
74. Stochastic Simulation as an Effective Cell Analysis Tool
Tommaso Mazza
75. Bond Graph Causality Assignment and Evolutionary Multi-Objective Optimization
Tony Wong, Gilles Cormier
76. Multi-Criteria Scheduling of Soft Real-Time Tasks on Uniform Multiprocessors
Using Fuzzy Inference
Vahid Salmani, Mahmoud Naghibzadeh, Mohsen Kahani and Sedigheh Khajouie Nejad
77. A Finite Element Program Based on Object-Oriented Framework for Spatial Trusses
Vedat TOĞAN and Serkan BEKİROĞLU
78. Design for Test Techniques for Asynchronous NULL Conventional Logic (NCL)
Venkat Satagopan, Bonita Bhaskaran, Waleed K. Al-Assadi, Scott C. Smith
and Sindhu Kakarla
79. Ant Colony Based Algorithm for Stable Marriage Problem
Ngo Anh Vien, Nguyen Hoang Viet, Hyun Kim, SeungGwan Lee
and TaeChoong Chung
80. Q-Learning Based Univector Field Navigation Method for Mobile Robots
Ngo Anh Vien, Nguyen Hoang Viet, HyunJeong Park, SeungGwan Lee and TaeChoong Chung
81. Statistical Modeling of Crosstalk Noise in Domino CMOS Logic Circuits
Vipin Sharma and Waleed K. Al-Assadi
82. A Decision Making Model for Dual Interactive Information Retrieval
Vitaliy Vitsentiy
83. Business Rules Applying to Credit Management
Vladimir Avdejenkov and Olegas Vasilecas
84. Information System in Atomic Collision Physics
V.M. Cvjetković, B.M. MarinKović and D. Šević
85. Incremental Learning of Trust while Reacting and Planning
W. Froelich, M. Kisiel-Dorohinicki and E. Nawarecki
86. Simulation of Free Feather Behavior
Xiaoming Wei, Feng Qiu and Arie Kaufman
87. Evolutionary Music Composer Integrating Formal Grammar
Yaser M.A. Khalifa, Jasmin Begovic, Badar Khan, Airrion Wisdom
and M. Basel Al-Mourad
89. On Path Selection for Multipath Connection
Yu Cai and C. Edward Chow
90. Some Results on the Sinc Signal with Applications to Intersymbol Interference
in Baseband Communication Systems
Zouhir Bahri
91. Multi-Focus Image Fusion Using Energy Coefficient Matrix
Adnan Mujahid Khan, Mudassir Fayyaz and Asif M. Gillani
A New Algorithm and Asymptotical Properties for the Deadlock Detection Problem
for Computer Systems with Reusable Resource Types
Youming Li and Robert Cook
92. Measuring Machine Intelligence of an Agent-Based Distributed Sensor Network System
Anish Anthony and Thomas C. Jannett
93. Image Processing for the Measurement of Flow Rate of Silo Discharge
Cédric DEGOUET, Blaise NSOM, Eric LOLIVE and André GROHENS
94. A Blind Watermarking Algorithm Based on Modular Arithmetic
in the Frequency Domain
Cong Jin, Zhongmei Zhang, Yan Jiang, Zhiguo Qu and Chuanxiang Ma
95. Determination of Coordinate System in Short-axis View of Left Ventricle
Gaurav Sehgal, Gabrielle Horne and Peter Gregson
96. On-line Modeling for Real-time, Model-Based, 3D Pose Tracking
Hans de Ruiter and Beno Benhabib
97. Grid Enabled Computer Vision System for Measuring Traffic Parameters
Ivica Dimitrovski, Gorgi Kakasevski, Aneta Buckovska, Suzana Loskovska
and Bozidar Proevski
98. Physically Constrained Neural Network Models for Simulation
J. E. Souza de Cursi and A. Koscianski
This book includes Volume I of the proceedings of the 2006 International Conference on Systems,
Computing Sciences and Software Engineering (SCSS). SCSS is part of the International Joint Conferences
on Computer, Information, and Systems Sciences, and Engineering (CISSE 06). The proceedings are a set
of rigorously reviewed world-class manuscripts presenting the state of international practice in Advances
and Innovations in Systems, Computing Sciences and Software Engineering.
SCSS 06 was a high-caliber research conference conducted online. CISSE 06 received 690 paper
submissions, and the final program included 370 accepted papers from more than 70 countries on six
continents. Each paper received at least two reviews, and authors were required to address review
comments prior to presentation and publication.
Conducting SCSS 06 online presented a number of unique advantages:
- All communications between the authors, reviewers, and conference organizing committee were
conducted online, which permitted a short six-week period from the paper submission deadline to the
beginning of the conference.
- PowerPoint presentations and final paper manuscripts were available to registrants for three weeks
prior to the start of the conference.
- The conference platform allowed live presentations by several presenters from different locations,
with the audio and PowerPoint slides transmitted to attendees over the internet, even on dial-up
connections. Attendees were able to ask both audio and written questions in a chat-room format, and
presenters could mark up their slides as they deemed fit.
- The live audio presentations were also recorded and distributed to participants, along with the
PowerPoint presentations and paper manuscripts, on the conference DVD.
The conference organizers are confident that you will find the papers included in this volume interesting
and useful.
Khaled Elleithy, Ph.D.
Bridgeport, Connecticut
June 2007
The 2006 International Conference on Systems, Computing Sciences and Software Engineering (SCSS)
and the resulting proceedings could not have been organized without the assistance of a large number of
individuals. SCSS is part of the International Joint Conferences on Computer, Information, and Systems
Sciences, and Engineering (CISSE). I had the opportunity to co-found CISSE in 2005, with Professor
Tarek Sobh, and we set up mechanisms that put it into action. Andrew Rosca wrote the software that
allowed conference management, and interaction between the authors and reviewers online. Mr. Tudor
Rosca managed the online conference presentation system and was instrumental in ensuring that the event
met the highest professional standards. I also want to acknowledge the roles played by Sarosh Patel and
Ms. Susan Kristie, our technical and administrative support team.
The technical co-sponsorship provided by the Institute of Electrical and Electronics Engineers (IEEE) and
the University of Bridgeport is gratefully appreciated. I would like to express my thanks to Prof. Toshio
Fukuda, Chair of the International Advisory Committee and the members of the SCSS Technical Program
Committee, including: Abdelaziz AlMulhem, Alex A. Aravind, Ana M. Madureira, Mostafa Aref,
Mohamed Dekhil, Julius Dichter, Hamid Mcheick, Hani Hagras, Marian P. Kazmierkowski, Low K.S.,
Michael Lemmon, Rafa Al-Qutaish, Rodney G. Roberts, Sanjiv Rai, Samir Shah, Shivakumar Sastry,
Natalia Romalis, Mohammed Younis, Tommaso Mazza, and Srini Ramaswamy.
The excellent contributions of the authors made this world-class document possible. Each paper received
two to four reviews. The reviewers worked tirelessly under a tight schedule and their important work is
gratefully appreciated. In particular, I want to acknowledge the contributions of the following individuals:
Yongsuk Cho, Michael Lemmon, Rafa Al-Qutaish, Yaser M. A. Khalifa, Mohamed Dekhil, Babar Nazir,
Khaled Hayatleh, Mounir Bousbia-Salah, Rozlina Mohamed, A. Sima Etner-Uyar, Hussein Abbass, Ahmad
Kamel, Emmanuel Udoh, Rodney G. Roberts, Vahid Salmani, Dongchul Park, Sergiu Dumitriu, Helmut
Vieritz, Waleed Al-Assadi, Marc Wilke, Mohammed Younis, John Zhang, Feng-Long Huang, Natalia
Romalis, Hamid Mcheick, Minkoo Kim, Khaled Rasheed, Chris Panagiotakopoulos, Alex Aravind, Dinko
Gichev, Dirk Mueller, Andrew Vincent, Ana Madureira, Abhilash Geo Mathews, Yu Cai, Spyros Kazarlis,
Liu Xia, Pavel Osipov, Hamad Alhammady, Fadel Sukkar, Jorge Loureiro, Hemant Joshi, Hossam Fahmy,
Yoshiteru Ishida, Min Jiang, Vien Ngo Anh, Youming Li, X. Sheldon Wang, Nam Gyu Kim, Vasso
Stylianou, Tommaso Mazza, Radu Calinescu, Nagm Mohamed, Muhammad Ali, Raymond Wu, Mansour
Tahernezhadi, Trevor Carlson, Sami Habib, Vikas Vaishnav, Vladimir Avdejenkov, Volodymyr Voytenko,
Vygantas Petrauskas, Shivakumar Sastry, U. B. Desai, Julius Dichter, Hani Hagras, Giovanni Morana,
Mohammad Karim, Thomas Nitsche, Rosida Coowar, Anna Derezinska, Amala Rajan, Aleksandras
Vytautas Rutkauskas, A. Ismail, Mostafa Aref, Ahmed Abou-Alfotouh, Damu Radhakrishnan, Sameh
ElSharkawy, George Dimitoglou, Marian P. Kazmierkowski, M. Basel Al-Mourad, Ausif Mahmood,
Nawaf Kharma, Fernando Guarin, Kaitung Au, Joanna Kolodziej, Ugur Sezerman, Yujen Fan, Zheng Yi
Wu, Samir Shah, Sudhir Veerannagari, Junyoung Kim and Sanjiv Rai.
Khaled Elleithy, Ph.D.
Bridgeport, Connecticut
June 2007

Chapters (84)

In this project, software components (JavaBeans) were developed that can communicate through different communication protocols with hardware elements connected to sensors and control devices for monitoring and controlling different physical variables, forming a hardware-software platform that follows the virtual instrument design pattern. The implemented communication protocols are RS232, 1-Wire and TCP/IP, together with annexed technologies such as WiFi (Wireless Fidelity) and WiMAX.
An online decision support system for dairy farms was created to help Lithuanian dairy farmers, scientists, dairy technology producers, students and others interested in the dairy business. It enables them to use the newest information and technology for planning their own business.
The main objective of this paper is to present an investment decision management system for exchange and capital markets – the Double Trump model. The main problems solved with this model are termed quantitative decision search problems. Computer simulation methods are also analysed as the main means of solving the mathematical models, viewed as stochastic programming tasks, in order to reflect the problems' characteristics. Attention is paid to revealing the analytical possibilities of the decision management system and to identifying decision methods, analysing such non-traditional problems of financial engineering as maximization of a three-dimensional utility function over the set of possible portfolio values adequate for the reliability assessment of investment decisions, and the search for a concept and mathematical methods for commensurating the profitability, reliability and riskiness of investment decisions. Solving the problems named above ensures sustainable development of investment decisions in capital and exchange markets.
The room synchronization problem was first introduced by Joung in 1998 and has been widely studied since. The problem arises in various practical applications that require concurrent data sharing: it aims at achieving exclusive access to shared data while facilitating suitable concurrency. This paper presents a new algorithm that solves the room synchronization problem in shared memory systems. The algorithm is simple and easy to prove correct. Its main appeal is that, in some sense, it closely emulates the traditional queuing systems commonly used in practice. Also, the algorithm is distributed and offers unbounded concurrency while assuring bounded delay, which was conjectured to be unattainable in [8].
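The chapter's algorithm itself is not reproduced here; as a minimal illustration of the queue-emulating exclusive access it is compared against, a FIFO ticket lock can be sketched as follows (all names are illustrative):

```python
import threading

# A minimal FIFO ticket lock: threads are served strictly in arrival
# order, emulating a queuing system. This is an illustrative sketch,
# not the chapter's room-synchronization algorithm.
class TicketLock:
    def __init__(self):
        self._next_ticket = 0
        self._now_serving = 0
        self._cond = threading.Condition()

    def acquire(self):
        with self._cond:
            my_ticket = self._next_ticket     # take the next ticket
            self._next_ticket += 1
            while self._now_serving != my_ticket:
                self._cond.wait()             # wait until it is our turn

    def release(self):
        with self._cond:
            self._now_serving += 1            # serve the next ticket holder
            self._cond.notify_all()

# ten threads increment a shared counter under the lock
counter = 0
lock = TicketLock()

def work():
    global counter
    for _ in range(100):
        lock.acquire()
        counter += 1
        lock.release()

threads = [threading.Thread(target=work) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 1000
```

Because every increment happens inside the critical section, the final count equals the total number of acquisitions, and threads enter in ticket order.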
This paper discusses the development of applications dedicated to blind users with the help of reusable components. The methodology relies on component-based development. For this purpose, braille-speech widgets adapted from classical widgets have been studied, specified and implemented. The developed components can be used by developers to implement software for blind users. The contribution of this work to the field of assistive technology is valuable, because there are no existing tools that facilitate the creation of interfaces for blind users, and it may considerably improve computer access for this category of users.
Agent-based computing can be considered a new general-purpose paradigm for software development, which tends to radically influence the way a software system is conceived and developed, and which calls for new agent-specific software engineering approaches. This paper presents an architecture for distributed manufacturing scheduling that follows Agent Oriented Software Engineering (AOSE) guidelines through the specifications defined by the Ingenias methodology. The architecture is based on a Multi-Agent System (MAS) composed of a set of autonomous agents that cooperate to accomplish a good global solution.
The Toulmin model, although probably intended as a method of exploring arguments in a more theoretical setting, is finding itself used more and more to represent knowledge in different types of decision support, whether computerized or not. The GAAM presented here shows that it is possible to represent complex knowledge in a non-dialectical manner and that discretionary decisions requiring intuitive action can be modelled and hopefully predicted. The great benefit of this type of system is that it begins to make the intuitive part of sentencing more transparent and open to scrutiny. Even though the system is not designed for the judiciary to use for decision support in their own sentencing, it could be used to help train judges and magistrates. Once in use by VLA, the system will hopefully provide a method for lawyers, both experienced and inexperienced, to make better sentencing arguments for clients before the bench.
An approach to invariant clustering and recognition of objects (situations) in a dynamic environment is proposed. The approach combines clustering by an unsupervised neural network (in particular ART-2) with preprocessing of sensor information by a forward multi-layer perceptron (MLP) with error back-propagation (EBP), which is supervised by the clustering neural network. Using the MLP with EBP allows a pattern with relatively small transformations (shift, rotation, scaling) to be recognized as a known previous cluster, and reduces the production of large numbers of clusters in a dynamic environment, e.g. during movement of a robot or recognition of novelty in a security system.
After a presentation of the nonlinear properties of neural networks, their applications to hydrology are described. A neural predictor is used satisfactorily to estimate a flood peak. The main contribution of the paper concerns an original method for visualising a hidden underground flow. Satisfactory experimental results were obtained that fitted well with the knowledge of local hydrogeology, opening up an interesting avenue for modelling using neural networks.
The need to provide computers with the ability to distinguish the affective state of their users is a major requirement for the practical implementation of Affective Computing concepts. Determining the affective state of a computer user from measurements of his or her physiological signals is a promising avenue towards that goal. In addition to the signals typically monitored for affective assessment, such as the Galvanic Skin Response (GSR) and the Blood Volume Pulse (BVP), other physiological variables, such as the Pupil Diameter (PD), may be able to provide a way to assess the affective state of a computer user in real time. This paper studies the significance of pupil diameter measurements for differentiating two affective states (stressed vs. relaxed) in computer users performing tasks designed to elicit those states in a predictable sequence. Specifically, the paper compares the discriminating power of the pupil diameter measurement to those of other single-index detectors derived from simultaneously acquired signals, in terms of their Receiver Operating Characteristic (ROC) curves.
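As a minimal illustration of the evaluation method (not the chapter's data), ROC points for a single-index detector can be computed by sweeping a decision threshold over the index values:

```python
# ROC points for a single-index detector, computed by sweeping a decision
# threshold. The numbers below are hypothetical stand-ins for per-user
# index readings, not the chapter's pupil-diameter measurements.
def roc_points(scores_pos, scores_neg):
    thresholds = sorted(set(scores_pos + scores_neg), reverse=True)
    pts = []
    for t in thresholds:
        tpr = sum(s >= t for s in scores_pos) / len(scores_pos)  # hit rate
        fpr = sum(s >= t for s in scores_neg) / len(scores_neg)  # false-alarm rate
        pts.append((fpr, tpr))
    return pts

stressed = [4.1, 4.5, 3.9, 4.8]   # hypothetical index values, "stressed" class
relaxed  = [3.0, 3.4, 3.2, 3.8]   # hypothetical index values, "relaxed" class
pts = roc_points(stressed, relaxed)
print(pts[0], pts[-1])  # (0.0, 0.25) at the strictest threshold, (1.0, 1.0) at the loosest
```

A detector with more discriminating power traces a curve closer to the top-left corner (high TPR at low FPR).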
Video compression standards operate by removing redundancy in the temporal, spatial, and even frequency domains. Temporal redundancy is usually removed by motion-compensated prediction, resulting in Inter-, Intra-, and Bidirectional frames. However, video coding standards specify the bit stream rather than the encoding process. Thus one of the key tasks of any implementation of such standards is to estimate the modes of frames as well as of macroblocks. In this article we propose a novel probing technique for this purpose.
Many techniques have been developed for speech compression. In this study, linear vector quantization is first used to compress speech signals. Then the Linde-Buzo-Gray (LBG) algorithm, which is used for image compression, is adapted to the compression of speech signals. Before this process, the vector transform (VT) defined in the second chapter is applied to the speech signals. After the VT, the speech vectors are coded using the vectorized LBG algorithm, and the inverse VT is applied to the decoded data. The compression results obtained are evaluated using graphs and SNR values.
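A minimal sketch of the LBG codebook-splitting idea, using toy 2-D vectors in place of speech vectors (the splitting factor, data, and function names are illustrative assumptions, not the chapter's implementation):

```python
# Minimal LBG (Linde-Buzo-Gray) vector quantizer sketch: the codebook is
# grown by splitting each codeword into a perturbed pair, then refined by
# nearest-neighbour assignment and centroid updates. The multiplicative
# split assumes nonzero centroid components, which holds for this toy data.
def lbg(vectors, codebook_size, eps=0.01, iters=20):
    dim = len(vectors[0])
    centroid = [sum(v[d] for v in vectors) / len(vectors) for d in range(dim)]
    codebook = [centroid]
    while len(codebook) < codebook_size:
        # split each codeword into two slightly perturbed copies
        codebook = [[c * (1 + s) for c in cw] for cw in codebook for s in (eps, -eps)]
        for _ in range(iters):
            cells = [[] for _ in codebook]
            for v in vectors:
                i = min(range(len(codebook)),
                        key=lambda i: sum((a - b) ** 2 for a, b in zip(v, codebook[i])))
                cells[i].append(v)
            # move each codeword to the centroid of its cell (keep it if empty)
            codebook = [[sum(v[d] for v in cell) / len(cell) for d in range(dim)]
                        if cell else cw for cell, cw in zip(cells, codebook)]
    return codebook

data = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0), (0.9, 1.1)]
cb = lbg(data, 2)
print(sorted(cb))  # two codewords near (0.05, 0.0) and (0.95, 1.05)
```

Encoding then replaces each vector by the index of its nearest codeword, which is what yields the compression.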
This work presents the development of the software that controls a set of equipment, called Ground Support Equipment (GSE), which verifies requirement fulfilment and supports integration procedures of the Multispectral Camera (MUXCAM) of the CBERS-3 and 4 satellites. The software development followed an iterative spiral model with agile-method characteristics that were originally used at Opto Electronics in industrial and medical equipment projects. This approach allowed a small team of only four engineers to create the first software version quickly, even while sharing time with the GSE's hardware development, and to keep the project on schedule in spite of some requirement changes.
Load balancing iterative algorithms is an interesting resource-allocation problem, useful for reducing total elapsed processing time through parallel processing. Load balancing means that each processor in a parallel processing environment handles about the same computational load. It is not sufficient to allocate the same number of processes to each processor, since different processes or tasks can require different loads [1]. For iterative algorithms, load balancing is the process of distributing the iterations of a loop to individual processes [2]. This paper analyzes different methods used for load balancing. Each method is measured by how well it reduces the total elapsed time and by its algorithmic complexity and overhead. Measured data for the different load balancing methods are included.
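Two common loop-scheduling policies in this setting can be sketched as follows; the function names and chunk size are illustrative assumptions, not the chapter's notation:

```python
# Static block partitioning vs. dynamic chunked self-scheduling for
# distributing loop iterations among workers. Static blocks are cheap
# but assume uniform per-iteration cost; dynamic chunks let idle
# workers grab more work at the cost of scheduling overhead.
def static_blocks(n_iters, n_workers):
    """Split iterations into contiguous, nearly equal blocks."""
    base, extra = divmod(n_iters, n_workers)
    blocks, start = [], 0
    for w in range(n_workers):
        size = base + (1 if w < extra else 0)  # spread the remainder
        blocks.append(range(start, start + size))
        start += size
    return blocks

def dynamic_chunks(n_iters, chunk):
    """Yield fixed-size chunks that idle workers grab on demand."""
    for start in range(0, n_iters, chunk):
        yield range(start, min(start + chunk, n_iters))

print([list(b) for b in static_blocks(10, 3)])   # [[0,1,2,3],[4,5,6],[7,8,9]]
print([list(c) for c in dynamic_chunks(10, 4)])  # [[0,1,2,3],[4,5,6,7],[8,9]]
```

The trade-off the chapter measures is exactly this: static partitioning minimizes overhead, while dynamic schemes tolerate iterations of unequal cost.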
An algorithm for routing free messages between processing elements in a multiprocessor system is proposed. An n-dimensional hypercube is used as the basic architecture, in which only one of the processors is connected to an external user; this external machine is called the host processor. Bidirectional one-port links are assumed, some of which may be faulty at the same time. The algorithm can be applied to an arbitrary connected multiprocessor system.
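As background for the routing setting: hypercube neighbours differ in exactly one address bit, so a fault-free dimension-order route can be sketched as below. The chapter's contribution is handling faulty links, which this minimal sketch deliberately omits:

```python
# Dimension-order routing in an n-dimensional hypercube: move from src
# to dst by flipping one differing address bit per hop. Illustrative
# sketch only; it assumes all links are fault-free.
def route(src, dst, n):
    path, node = [src], src
    for d in range(n):
        if (node ^ dst) >> d & 1:    # addresses differ in dimension d
            node ^= 1 << d           # traverse the link in that dimension
            path.append(node)
    return path

print(route(0b000, 0b101, 3))  # [0, 1, 5]: flip bit 0, then bit 2
```

Each hop crosses exactly one link, so the path length equals the Hamming distance between the source and destination addresses.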
We present a new numerical tool to determine solutions of non-zero-sum multi-player difference games. In particular, we describe the computer algorithm OPTGAME (version 2.0) which solves affine-quadratic games and approximates solutions for nonlinear games iteratively by using a local linearization procedure. The calculation of these solutions (open-loop and feedback Nash and Stackelberg equilibrium solutions) is sketched, as is the determination of the cooperative Pareto-optimal solution.
This paper provides a brief overview of Domain Model RAD, a web framework used for developing dynamic web applications with a minimum amount of programming. Domain Model RAD uses Domain Model Lite to represent the domain model of a web application; Domain Model Lite is a framework that facilitates the definition and use of domain models in Java. Domain Model RAD uses Wicket for web application pages and page sections. Wicket is a web framework that provides basic web components from which more advanced web components can be constructed in an object-oriented way. Domain Model RAD interprets the application model and creates default web pages from web components based on the domain model.
In this paper, new functional-type a posteriori error estimates for the viscous flow problem with a rotating term are presented. The estimates give guaranteed upper bounds on the energy norm of the error and provide reliable error indication. We describe the implementation of adaptive finite element methods (AFEM) in the framework of the functional-type estimates proposed. Computational properties of the estimates are investigated on a series of numerical examples.
A data model specifies the building blocks of databases, the rules for assembling these blocks, and the operations that can be performed on the built-up structures. We have to create increasingly complex applications, and the properties of the underlying data model of a Database Management System (DBMS) determine how easy it is to create an application that uses a database. There are many different data models, and we have to choose a DBMS whose underlying data model best fulfils the needs of an application. Existing comparisons of data models are mostly based on experience with one or another DBMS. This paper explains how to perform a non-empirical comparison of data models by using metamodels that describe the abstract syntax of these data models. We also present some results of a comparison of the underlying data model of SQL:2003 and the data model proposed in The Third Manifesto.
High Level Architecture (HLA) is a general-purpose architecture developed to support reuse and interoperability across a large number of different types of distributed simulation projects. HLA-compliant simulation development is a complex and difficult engineering process. This paper presents a CASE tool, named BEMGA, which aims to decrease the complexity of that process. Using BEMGA, one can easily model a distributed simulation, generate the simulation software, and produce the documentation files from the model.
This paper presents an integrated navigation tool developed in the framework of an advanced study on the navigation of Unmanned Aerial Vehicles. The study aimed at testing innovative navigation sensor configurations to support fully autonomous flight, even during landings and other critical mission phases. The tool is composed of sensor simulation and data fusion software. The most important navigation sensors installed onboard an unmanned aircraft have been modeled: inertial, GPS, air data, high-accuracy altimeter, and magnetometer. Their models include every non-negligible error source documented in the literature. Moreover, a specific sensor data fusion algorithm has been developed that integrates inertial sensor measurements with GPS and radar altimeter measurements. The paper reports on numerical testing of the sensor simulator and the data fusion algorithm. The algorithm was coded for real-time implementation to perform hardware-in-the-loop validation and in-flight tests onboard a small Unmanned Aerial Vehicle.
The computational Grid paradigm is now commonly used to define and model the architecture of a distributed software and hardware environment for executing scientific and engineering applications over wide-area networks. Resource management and load-balanced job scheduling are key concerns when implementing new Grid middleware components to improve resource utilization. Our work focuses on an evolutionary approach based on swarm intelligence, and specifically on the ant-colony meta-heuristic, to map the problem-solving capability of social insects onto the resource scheduling and balancing problem, achieving an acceptable near-optimal solution at substantially reduced complexity. The Grid resource management framework will be implemented as a multi-agent system in which all the agents communicate with each other through the network and cooperate according to ant-like local interactions, so that load balancing and Grid makespan/flowtime optimization are achieved as an emergent collective behaviour of the system. Simulation results show that the approach is a promising candidate for resource-balanced scheduling in Grid environments.
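The abstract does not spell out the ant-colony mapping itself; as a rough, hypothetical sketch (the function name, parameter choices, and pheromone/heuristic rules below are invented for illustration, not taken from the paper), an ant-colony heuristic assigning jobs to Grid resources to shrink the makespan might look like this:

```python
import random

def aco_schedule(job_lengths, cpu_speeds, n_ants=20, n_iters=50,
                 alpha=1.0, beta=2.0, rho=0.1, seed=0):
    """Map each job to a resource so the makespan (latest resource finish
    time) is small, using a basic ant-colony heuristic."""
    rng = random.Random(seed)
    n_jobs, n_res = len(job_lengths), len(cpu_speeds)
    # pheromone[j][r]: learned desirability of placing job j on resource r
    tau = [[1.0] * n_res for _ in range(n_jobs)]
    best_assign, best_makespan = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            load = [0.0] * n_res
            assign = []
            for j in range(n_jobs):
                # heuristic: prefer resources that would finish job j soonest
                eta = [1.0 / (load[r] + job_lengths[j] / cpu_speeds[r] + 1e-9)
                       for r in range(n_res)]
                w = [tau[j][r] ** alpha * eta[r] ** beta for r in range(n_res)]
                r = rng.choices(range(n_res), weights=w)[0]
                assign.append(r)
                load[r] += job_lengths[j] / cpu_speeds[r]
            makespan = max(load)
            if makespan < best_makespan:
                best_assign, best_makespan = assign, makespan
        # evaporate, then reinforce the best assignment found so far
        for j in range(n_jobs):
            for r in range(n_res):
                tau[j][r] *= (1.0 - rho)
            tau[j][best_assign[j]] += 1.0 / best_makespan
    return best_assign, best_makespan
```

For example, `aco_schedule([4, 3, 2, 8, 6], [1.0, 2.0])` distributes five jobs across a slow and a fast resource; the returned makespan can never beat the lower bound of total work divided by total speed (23/3 here).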
Grids provide a uniform interface to a collection of heterogeneous, geographically distributed resources. In recent years, research on Grid monitoring systems has become increasingly essential. In this paper we put forward a novel Mobile Agent-based Grid Monitoring Architecture (MA-GMA), which is based on the GMA from the GGF and introduces mobile agents and the cache mechanism of MDS. Building on the Open Grid Services Architecture (OGSA) standard, we merge the intelligence and mobility of mobile agents into the current OGSA to construct a dynamic and extensible monitoring system. Finally, we report experiments under different environments; the results show that MA-GMA is effective and improves monitoring performance greatly.
Expanding the training dataset is a recently proposed technique for improving the performance of classification methods. In this paper, we propose a powerful method for this task. Our method applies a Bayesian test based on emerging patterns to evaluate and improve the quality of the new data instances used to expand the training data space. Our experiments on a number of datasets show that our method outperforms previously proposed methods and is able to add additional knowledge to the data space.
Since the early 1970s, researchers have proposed several models to improve software reliability. Among these, the operational profile approach is one of the most common. Operational profiles are a quantification of usage patterns for a software application. The research described in this paper investigates a novel multi-agent framework for automatically creating an operational profile for generic distributed systems after their release into the market. The operational profile in this paper is extended to comprise seven different profiles. Also, the criticality of operations is defined using a new composite metric in order to organise the testing process and to decrease the time and cost involved. The proposed framework is a step towards making distributed systems intelligent and self-managing.
Mining association rules is an important area in data mining. The massively increasing volume of data in real-life databases has motivated researchers to design novel and efficient algorithms for association rule mining. In this paper, we propose an association rule mining algorithm that integrates interestingness criteria during the process of building the model. One of the main features of this approach is that it captures the user's background knowledge, which is monotonically augmented. We tested our algorithm on some public medical datasets and found the obtained results quite promising.
Recent years have witnessed an explosion in the availability of news articles on the World Wide Web, and organizing the results of a news search helps users get an overview of the returned news. In this work, we focus on label-based clustering approaches for news meta-search engines, which cluster news articles based on their topics. We also implement an engine for NEws meta-Search REsult Clustering (NeSReC). NeSReC takes queries from users, collects the snippets of the news retrieved for the queries by the AltaVista News search engine, and then performs hierarchical clustering and labeling based on the news snippets in a very short time.
The World Wide Web contains 170 terabytes of information [1], and storage estimates show that new information is growing at a rate of over 30% a year. With the amount of information growing exponentially, it is important to understand the information semantically, to know which concepts are relevant and which are irrelevant. The Evolutionary Behavior Of Textual Semantics (EBOTS) system being developed at the University of Arkansas at Little Rock [2] addresses the quantitative reasoning aspect of textual information. In automatic decision-making mode, the EBOTS system can distinguish between relevant and irrelevant information, discarding irrelevant documents and accepting only relevant information to develop expertise in a particular field. This paper discusses the usefulness of information theory in the development of relevance criteria and the results obtained in the context of textual information.
The quality of the architectural design of a software system has a great influence on achieving the system's non-functional requirements. The Unified Modeling Language (UML), the industry-standard object-oriented modeling language, needs a well-defined semantic base for its notation. Integrating a formal method, Petri nets (PNs), with the object-oriented design concepts of UML makes it possible to benefit from the strengths of both approaches, and formalizing the graphical notation enables automated processing and analysis tasks. In this paper we use a method for converting a State Diagram to a Generalized Stochastic Petri Net (GSPN), derive the embedded Continuous-Time Markov Chain from the GSPN, and finally use Markov chain theory to obtain performance parameters.
This paper describes the design, development, and deployment challenges facing an implementation of an enterprise-wide distributed web-based Planning, Budgeting and Reporting Control Management Information System for a large public utility organization. The system serves the needs of all departments of the company's General Division of Production. The departments of the division are situated all over the Greek state, with many geographically remote plants under the control of the division. To speed up the exchange of management information between the various levels of the hierarchy regarding daily, monthly, or longer-term reports on the operational level, a portal was set up that enabled all levels of management personnel to have controlled access to constantly updated information about operations, strategic goals, and actions. A new planning and budgeting system for controlling operational, investment, and personnel expenses based on the Activity-Based Costing (ABC) & Budgeting model was then integrated into the portal to provide a web-based Planning, Budgeting & Reporting Control MIS. The system is capable of handling many thousands of requests per hour for internal reports, graphs, set goals, etc., and allows seamless collaboration and coordination between all departments in the organizational hierarchy.
Since software must evolve to meet typically changing requirements, source code modifications cannot be avoided. Impact analysis is one of the central and relatively demanding tasks of software maintenance; it is constantly needed to ensure the correctness of the modifications made. Due to its importance and challenging nature, automated support techniques are required. Theoretically, forward slicing is a very suitable technique for this purpose, so we have implemented a program dependence graph (PDG) based tool for it, called GRACE. Due to the typical rewritings of Visual Basic programs, there is a great need to support their impact analysis; however, there were neither earlier scientific studies on slicing Visual Basic nor reported slicers for it. For forward slicing, efficient static slicing is needed that reveals all the potential effects of the considered source code modifications, and the use of PDGs helps achieve this goal. This paper therefore focuses on describing automated PDG-based forward slicing for impact analysis support of Visual Basic programs. GRACE contains a parser, a PDG generator, and all other components necessary to support forward slicing. Our experience with applying PDG-based forward slicing has confirmed the feasibility of the approach in this context. GRACE is also compared to other forward slicing tools.
Recent work has begun to investigate the advantages of using reversible logic for the design of circuits. The majority of work, however, has limited itself to combinational logic. Researchers are just now beginning to suggest possibilities for sequential implementations. This paper performs a closer analysis of three latch designs proposed in previous work and suggests advantages and disadvantages of each.
Many systems exist that store and manipulate data; however, many do not have sufficient support for spatial data. Many data structures have been proposed specifically for spatial data; however, software implementations have not performed as well as hoped. This work presents a feasibility study investigating the use of an FPGA for the implementation of a structure to support spatial search and retrieval.
We first analyze the security targets of implementing security management for today's IT infrastructures, i.e. the information systems enterprises create for successful business, and detail possible measures for achieving the relevant targets. Secondly, we conclude that the essentials of security management are to construct trustworthy network endpoints and to establish trustworthy communication channels between the intended communication parties; two instances of accomplishing these essentials are then exemplified: trustworthy smart card transactions and trustworthy SOA-based Web Services. Finally, we discuss the main aspects of implementing security management for information systems, namely the strategic steps of (1) attestation and negotiation, (2) proposing and implementing application-specific strategies, and (3) considering the strength and efficiency of security management.
Fuzzy Set Ordination (FSO) and fuzzy C-means classification techniques were used to study the relationships between plant communities and environmental factors in the Pangquangou Nature Reserve, Shanxi province, China. The Pangquangou Nature Reserve, located at N37°20'-38°20', E110°18'-111°18', is part of the Luliang mountain range. Eighty-nine quadrats of 10 m × 20 m along an elevation gradient were set up and recorded in this area. The results showed that the two methods, FSO and fuzzy C-means classification, describe the ecological relations of communities successfully. The FSO results showed that the distribution of communities is closely related to elevation, water conditions, and humidity, and also related to aspect and slope. Thirteen community types were distinguished by fuzzy C-means classification, each with its own special characteristics. The combination of FSO and fuzzy C-means classification may be more effective in studies of community ecology.
Imagine that intruders are in a dark polygonal room and move at finite but unbounded speed, trying to avoid detection. The polygon search problem asks whether a polygon is searchable, i.e., whether the searcher(s) can always detect the intruders no matter how they move. A polygon is LR-visible if there exist two boundary points such that the two polygonal chains they divide the boundary into are mutually weakly visible. We explore the relationship between the searchability and the LR-visibility of a polygon. Our result can be used as a preprocessing step in designing algorithms related to polygon search.
The continuous growth of OLAP users and data imposes additional stress on data management and hardware infrastructure. Distributing multidimensional data over a number of servers allows storage and processing power to increase without an exponential increase in financial costs. But this solution adds another dimension to the problem: space. Even in centralized OLAP, efficient cube selection is complex; now we must also know where to materialize subcubes. This paper proposes algorithms that solve the distributed OLAP selection problem under space constraints, considering a query profile, using discrete particle swarm optimization in its normal, cooperative, multi-phase, and hybrid genetic versions.
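The abstract names discrete (binary) particle swarm optimization but not its details; the following is only a minimal sketch of the basic binary-PSO idea applied to subcube selection under a space limit (the benefit model, the greedy repair step, and all names are assumptions made for illustration):

```python
import math, random

def dpso_select(sizes, benefits, space_limit, n_particles=10, n_iters=60, seed=1):
    """Pick a subset of subcubes to materialize (one bit per subcube) that
    maximizes total query benefit subject to a space limit, via binary PSO."""
    rng = random.Random(seed)
    n = len(sizes)

    def repair(bits):
        # drop the lowest benefit-per-size cube until the space limit holds
        while sum(s for s, b in zip(sizes, bits) if b) > space_limit:
            chosen = [i for i in range(n) if bits[i]]
            worst = min(chosen, key=lambda i: benefits[i] / sizes[i])
            bits[worst] = 0
        return bits

    def value(bits):
        return sum(ben for ben, bit in zip(benefits, bits) if bit)

    swarm = [repair([rng.randint(0, 1) for _ in range(n)])
             for _ in range(n_particles)]
    vel = [[0.0] * n for _ in range(n_particles)]
    pbest = [list(p) for p in swarm]
    gbest = max(pbest, key=value)
    for _ in range(n_iters):
        for k, p in enumerate(swarm):
            for i in range(n):
                vel[k][i] += rng.random() * (pbest[k][i] - p[i]) \
                           + rng.random() * (gbest[i] - p[i])
                # sigmoid of velocity = probability that the bit becomes 1
                p[i] = 1 if rng.random() < 1 / (1 + math.exp(-vel[k][i])) else 0
            repair(p)
            if value(p) > value(pbest[k]):
                pbest[k] = list(p)
                if value(p) > value(gbest):
                    gbest = list(p)
    return gbest, value(gbest)
```

With four candidate subcubes of sizes 5, 4, 3, 2 and benefits 10, 9, 3, 2 under a limit of 7, the best feasible value is 12 (e.g. cubes 0 and 3); the sketch searches for it while the repair step keeps every particle feasible.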
Recently, the Peer-to-Peer (P2P) architecture has been used to exploit the computing power and bandwidth of networks better than a client/server architecture can. To support the creation of P2P applications, frameworks such as JXTA have been proposed. However, large systems using a P2P architecture are complex to develop, maintain, and evolve. Model Driven Architecture (MDA) can help manage the complexity of the software development process through transformations of Platform Independent Models (PIM) into Platform Specific Models (PSM). In this paper, we apply an MDA approach to the development of applications based on JXTA. The JXTA implementation in Java is used to demonstrate our approach. We propose a model transformation definition from a UML model to a Java+JXTA model, and we present two case studies to validate the approach.
Automatic code generation of application families emerges as a solid promise for coping with the increasing demand for software in business environments. Using templates and metadata to develop abstract solutions, followed by automatic generation of the particular cases, helps free developers from the most mechanical and tedious tasks of the implementation phase, allowing them to focus on expressing conceptual solutions. In this case study, we adapted the Halstead metrics for object-oriented code, templates, and metadata (in XML format) to measure the effort required to specify and then automatically generate complete applications, in comparison with the effort required to build the same applications entirely by hand. We then used the same metrics to compare the effort of specifying and generating a second application of the same family versus the effort required to code this second application by hand.
Software quality has become increasingly important as a crucial factor in keeping organizations competitive. Software process measurement is an essential activity in achieving better quality and stronger guarantees, both in the development process and in the final product. This paper presents the use of multiple criteria in a proposed model for the software measurement process, making it possible to perform organizational planning for measurement, prioritize organizational metrics, and define minimal acceptance percentage levels for each metric. The measurement process is based on five well-known measurement processes: CMMI-SW, ISO/IEC 15939, IEEE Std 1061, Six Sigma, and PSM.
Software development is a complex activity that demands that a series of factors be controlled. For project management to control these factors effectively, software process measurement is necessary to identify problems and consider improvements. This paper presents an organizational software measurement process resulting from the mapping of five relevant software measurement processes: CMMI-SW, ISO/IEC 15939, IEEE Std 1061, Six Sigma, and PSM (Practical Software Measurement). The best practices of each were used, including relevant keys to facilitate the applicability of a measurement process focused on project management, as well as assuring software quality.
Currently, to obtain maximum-fidelity 3D audio, the intended listener is required to undergo time-consuming measurements using highly specialized and expensive equipment. Customizable Head-Related Impulse Responses (HRIRs) would remove this limitation. This paper reports our progress in the first stage of the development of customizable HRIRs. Our approach is to develop compact functional models that are equivalent to empirically measured HRIRs but require a much smaller number of parameters, which could eventually be derived from the anatomical characteristics of a prospective listener. For this first step, HRIRs must be decomposed into multiple delayed and scaled damped sinusoids, which in turn reveal the parameters (delay and magnitude) necessary to create an instance of the structural model equivalent to the HRIR under analysis. Previously, this type of HRIR decomposition has been accomplished through an exhaustive search of the model parameters. A new method that approaches the decomposition simultaneously in the frequency (Z) and time domains is reported here.
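The exhaustive-search style of decomposition mentioned above can be illustrated for a single damped sinusoid: grid-search the frequency and damping, and recover the scale in closed form as a least-squares projection. This is only a toy sketch of the general idea (the parameter grids and function names are invented here), not the paper's joint frequency/time-domain method:

```python
import math

def damped_sinusoid(n, freq, damp, phase=0.0):
    """Unit-amplitude damped sinusoid: exp(-damp*k) * cos(2*pi*freq*k + phase)."""
    return [math.exp(-damp * k) * math.cos(2 * math.pi * freq * k + phase)
            for k in range(n)]

def fit_damped_sinusoid(x, freqs, damps):
    """Grid-search frequency and damping; the scale is the closed-form
    least-squares projection of the signal onto each candidate atom."""
    best = None
    for f in freqs:
        for d in damps:
            atom = damped_sinusoid(len(x), f, d)
            denom = sum(a * a for a in atom)
            scale = sum(a * s for a, s in zip(atom, x)) / denom
            resid = sum((s - scale * a) ** 2 for s, a in zip(x, atom))
            if best is None or resid < best[0]:
                best = (resid, f, d, scale)
    return best  # (residual, freq, damp, scale)
```

When the true parameters lie on the grid, the fit recovers them exactly; a full HRIR decomposition would repeat this on the residual for each additional delayed component.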
This study presents the development of an interface for the management of diachronic spatial data that describe the evolution of an area. Successive data representing specific spatial characteristics for various time periods are organized and processed in a customized GIS environment. Vector and raster data (old scanned maps, air photos, and satellite imagery) are related based on their spatial and temporal properties and are archived adequately. Part of the data set contains digital documentation in the form of digital photos, audio, and video files, so the customization includes multimedia playback for selected geographic features described by these means. An extended area in Northern Greece that includes various archaeological sites along the Egnatia road (near the ancient Via Egnatia) is used as a case study.
This paper concerns measuring similarity in a content-based information retrieval and intelligent question-answering environment. While measures of semantic similarity between concepts based on the hierarchy of an ontology are well studied, measuring semantic similarity in an arbitrary ontology is still an open problem. In this paper we define a fuzzy semantic similarity measure based on information theory that exploits both the hierarchical and the non-hierarchical structure of an ontology. Our work can be generalized as follows: first, each concept is defined as a semantically extended fuzzy set along its semantic paths; second, the semantic similarity between two concepts is computed from the two semantically extended fuzzy sets instead of from the two concepts themselves. Our fuzzy measure synthetically considers factors that affect the value of similarity, such as ontological semantic relation density, semantic relation depth, and the different semantic relations. Compared with existing measures, this fuzzy similarity measure based on shared information content better reflects the latent semantic relations of concepts.
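The classical shared-information-content idea that such measures build on (not the paper's fuzzy extension itself) can be sketched on a toy taxonomy: the information content of a concept is the negative log of its corpus probability, and Lin-style similarity is the ratio of the information shared through the most informative common ancestor. The taxonomy and counts below are invented for illustration:

```python
import math

# Toy taxonomy: child -> parent (the root has no entry); corpus counts per concept.
parent = {"cat": "mammal", "dog": "mammal", "mammal": "animal", "bird": "animal"}
counts = {"cat": 10, "dog": 12, "bird": 8, "mammal": 5, "animal": 2}

def ancestors(c):
    out = [c]
    while c in parent:
        c = parent[c]
        out.append(c)
    return out

def prob(c):
    # p(c): mass of c and all its descendants over the total corpus mass
    total = sum(counts.values())
    descendants = [d for d in counts if c in ancestors(d)]
    return sum(counts[d] for d in descendants) / total

def ic(c):
    # information content: rarer (more specific) concepts carry more information
    return -math.log(prob(c))

def lin_similarity(a, b):
    # most informative common ancestor, then Lin's ratio of shared IC
    common = [c for c in ancestors(a) if c in ancestors(b)]
    mica = max(common, key=ic)
    return 2 * ic(mica) / (ic(a) + ic(b))
```

Here `lin_similarity("cat", "dog")` is strictly between 0 and 1 because the pair shares the informative ancestor "mammal", while `lin_similarity("cat", "bird")` is 0 because their only common ancestor is the uninformative root.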
Chemically Assembled Electronic Nanotechnology (CAEN), which uses a bottom-up approach to digital circuit design, has opened new dimensions in the miniaturization of electronic devices. Crossbar structures, or nanofabrics, using silicon nanowires and carbon nanotubes are the proposed building blocks for CAEN; at feature sizes below 20 nm they allow at least 10^10 gates/cm^2. Along with the decrease in size, defect rates in these architectures increase rapidly and can be as high as 13% or more, demanding an entirely different paradigm for increasing yields, namely greater defect tolerance. In this paper, we propose a non-probabilistic approach to defect tolerance and evaluate its coverage for different fabric sizes and defect rates.
This paper addresses the power control aspect of resource allocation for wireless data, employing the microeconomic concepts of utility and pricing in relation to noncooperative game theory and the Nash equilibrium. Specifically, an efficient algorithm based on a stochastic gradient formulation is proposed that adaptively converges to an optimal power set with higher utilities, with the pricing factor as a parameter. Both single-cell and multi-cell cases are presented. Comparative numerical and graphical results attest to the practicality of the approach.
mArachna is a technical framework designed for the extraction of mathematical knowledge from natural language texts. mArachna avoids the problems typically encountered in automated-reasoning-based approaches by using natural language processing techniques that take advantage of the strictly formalized language characterizing mathematical texts. Mathematical texts possess a strict internal structure and can be separated into text elements (entities) such as definitions, theorems, etc. These entities are the principal carriers of mathematical information. In addition, entities show a characteristic coupling between the presented information and their internal linguistic structure, well suited to natural language processing techniques. Taking advantage of this structure, mArachna extracts mathematical relations from texts and integrates them into a knowledge base. Identifying sub-elements of new information items with already stored mathematical concepts defines the structure of the knowledge base. As a result, mArachna generates an ontology of the analyzed mathematical texts. In response to user queries, parts of the knowledge base are visualized using OWL. In particular, mArachna aims to provide an overview of individual fields of mathematics, as well as showing intra-field relations between mathematical objects and concepts. This paper gives an overview of the theoretical basis and the technologies applied within the mArachna framework.
Today, many institutions and organizations face serious problems due to the tremendously increasing number of documents, which in turn aggravates storage and retrieval problems as space and efficiency requirements continuously grow. The problem becomes more complex with time and with the size and number of documents in an organization, so there is a growing demand to address it. This demand can be met by developing a process that enables specialized document imaging staff to select the most suitable image type and scanning resolution when document images need to be stored. In this paper, we present a process to optimize the selection of the scanned image type and resolution prior to acquiring the document image to be stored and later retrieved, thereby optimizing document image storage size and retrieval time.
In this paper, we describe the development of a search engine and information retrieval system for Bangla. Current work in this area assumes the use of a particular type of encoding or the availability of particular facilities on the user's side. We wanted an implementation that does not require any special features or optimizations on the user end and performs just as well in all situations. For this purpose, we picked two case studies in our effort to find a suitable solution to the problem. While working on these cases, we encountered several problems and had to find our way around them. We had to choose, from a set of software packages, the one that would best serve our needs, and we also had to take user convenience into consideration, keeping in mind the diverse demographics of the people who might need such a system. Finally, we arrived at a system with all the desired features. Some possible future developments that came to mind in the course of our work are also mentioned in this paper.
Echo hiding is one of the prevailing techniques in audio watermarking due to its good perceptual quality. However, the detection ratio of this method is relatively low, and its robustness against many common signal-processing operations is not satisfactory. In this paper, an improved watermark extraction algorithm based on the auto-power-cepstrum is proposed. Computer simulation results show that the new method achieves a higher detection ratio than the conventional auto-complex-cepstrum based algorithm, and that its robustness against various signal-processing manipulations, such as MP3 compression, re-sampling, cropping, re-quantization, filtering, amplitude amplification, noise addition, and time delay, is high.
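The cepstral principle behind echo detection can be sketched independently of the paper's specific auto-power-cepstrum variant: an echo at delay d multiplies the spectrum by (1 + a·e^(-j2πfd)), so the log-magnitude spectrum ripples with period n/d, and the cepstrum (inverse transform of the log magnitude) shows a peak at quefrency d. A naive O(n²) DFT keeps this sketch self-contained (the function names and the plain real cepstrum are illustrative simplifications):

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * math.pi * j * k / n) for k in range(n))
            for j in range(n)]

def real_cepstrum(x):
    spec = dft(x)
    log_mag = [math.log(abs(v) + 1e-12) for v in spec]
    n = len(x)
    # inverse DFT of the log-magnitude spectrum (real part)
    return [sum(log_mag[j] * cmath.exp(2j * math.pi * j * k / n)
                for j in range(n)).real / n for k in range(n)]

def detect_echo_delay(x, min_lag=4):
    """Return the quefrency (above min_lag) with the strongest cepstral peak,
    i.e. the estimated echo delay in samples."""
    c = real_cepstrum(x)
    half = len(x) // 2
    return max(range(min_lag, half), key=lambda k: c[k])
```

Embedding a 0.6-amplitude echo 16 samples after a noise carrier and running `detect_echo_delay` recovers the delay, which is exactly the information an echo-hiding watermark encodes.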
It is believed that trust will be the primary mental force in the electronic environment, as it is in the current physical environment. At its core, trust is affected by a user's propensity to trust (an internal mental state), reliance on the trustee, and external direct and indirect factors. Decentralization of publication is one of the great advantages of the Internet infrastructure, and Web 2.0 applications aim to help online users publish and contribute freely to Collective Intelligence. They revolve around the notion that people can add content for Collective Intelligence and that enterprises can use this content to reduce costs and increase profits through Social Network Analysis. This paper proposes a conceptual mental trust recognition and evaluation model and a meta-document structure to represent, distribute, and store users' trust evaluations. The proposed document design is based on a decentralized information structure that semantically represents the contributed content and the contributor. The content is represented and distributed using Atom, the Resource Description Framework (RDF), and RDF Schema. The proposed meta-document structure uses RDF Schema to semantically represent a user's internal online trust evaluation model within the Web 2.0 environment. It can be used as a blueprint to develop new vocabularies for any e-domain; the trust recognition model was selected due to its importance in the electronic environment.
This paper presents a quorum-based distributed algorithm for group mutual exclusion. In the group mutual exclusion problem, multiple processes can enter a critical section simultaneously if they belong to the same group. The algorithm satisfies the following properties: only one session is open at any time, several processes can access the same session, and any requested session is opened in finite time. The message complexity of the algorithm is O(√n) for the finite projective plane of order 2 (the Fano plane) and O(2√n - 1) for a grid, where n is the total number of processes.
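The grid quorum underlying the 2√n - 1 figure can be sketched directly: arrange the n processes in a √n × √n grid and let a process's quorum be its entire row plus its entire column. Any two such quorums intersect, because one process's row always crosses the other's column. This sketch shows only the quorum construction, not the paper's message-passing protocol:

```python
import math

def grid_quorum(pid, n):
    """Quorum of process `pid` in a sqrt(n) x sqrt(n) grid: its full row plus
    its full column, 2*sqrt(n) - 1 members in total.
    Assumes n is a perfect square."""
    side = math.isqrt(n)
    assert side * side == n, "n must be a perfect square for a grid quorum"
    row, col = divmod(pid, side)
    row_members = {row * side + c for c in range(side)}
    col_members = {r * side + col for r in range(side)}
    return row_members | col_members
```

For n = 9 each quorum has 2·3 - 1 = 5 members, and every pair of quorums shares at least one process, which is what lets a quorum-based algorithm arbitrate sessions.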
The Java security package allows a programmer to add security features to Java applications. Although the package provides a complex application programming interface (API), its informal description, e.g., Javadoc comments, is often ambiguous or imprecise. Nonetheless, the security of an application can be compromised if the package is used without a concrete understanding of the precise behavior of the API classes and interfaces, which can be attained via formal specification. In this paper, we present our experiences in formally specifying the Java security package in JML, a formal behavior interface specification language for Java. We illustrate portions of our JML specifications and discuss the lessons that we learned, from this specification effort, about specification patterns and the effectiveness of JML. Our specifications are not only a precise document for the API but also provide a foundation for formally reasoning and verifying the security aspects of applications. We believe that our specification techniques and patterns can be used to specify other Java packages and frameworks.
In this new computing age of high complexity, a common weakness in the interoperability between business and IT leaves IT far behind the direction business is taking; poor business responsiveness and IT governance make it even harder to achieve the enterprise's goals. To cope with this common issue, we introduce enterprise interoperability to integrate the metadata between the business, service, and information layers; this creates visibility of vertical alignment within the enterprise architecture and uses metadata configuration to construct the mapping between the layers.
Enterprise Architecture has been a center of attention since the late 1990s as a comprehensive and leading solution for the development and maintenance of information systems. An enterprise can be considered a set of elaborate physical and logical processes in which information flow plays a crucial role. The term Enterprise Architecture encompasses a collection of different views within the enterprise that constitute a comprehensive overview when put together. Such an overview cannot be organized without a logical structure called an Enterprise Architecture Framework. Among the various proposed frameworks, the Zachman Framework (ZF) is one of the most prominent means of conceptualization. The main problem faced in using ZF is the lack of coherent and consistent models for its cells. Several solutions have been proposed to eliminate this problem, but none succeeds in thoroughly covering all the cells of ZF. In this paper, we propose an integrated language based on Model Driven Architecture (MDA) to obtain compatible models for all cells of ZF. The proposed method was examined in practice, revealing its advantages and efficiency in comparison to previously studied techniques.
Shopping around for a good service provider in a Grid computing environment is no less challenging than traditional shopping around in a non-virtual marketplace. A client may consult a service broker for providers that can meet specific QoS requirements (e.g., CPU speed), and the broker may return a list of candidate providers that satisfy the client's demands. If this computing platform is backed by a reputation system, the list of providers is then sorted by some reputation criterion, which is commonly the user rating. We argue in this paper that judging the reputation of a provider based on user rating alone is not sufficient. The reputation should additionally reflect how trustworthy that provider has been with respect to complying with the finalized SLA (using a metric called conformance) and how consistent it has been with respect to honouring its compliance levels (using a metric called fidelity). Accordingly, we perceive the reputation as a vector of three dimensions: user rating, conformance, and fidelity. In this paper, we define these metrics, explain how to compute them formally, and show how to use them in the reputation-enabled framework that we describe.
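One illustrative reading of such a three-dimensional reputation vector (these particular formulas are invented for the sketch; the paper gives its own formal definitions of conformance and fidelity) is:

```python
import statistics

def reputation_vector(user_ratings, sla_met):
    """Illustrative three-dimensional reputation:
       rating      - mean user rating, scaled to [0, 1]
       conformance - fraction of interactions whose SLA was honoured
       fidelity    - how steady that compliance level has been over time
    """
    rating = statistics.mean(user_ratings) / 5.0        # ratings on a 1-5 scale
    compliance = [1.0 if ok else 0.0 for ok in sla_met]
    conformance = statistics.mean(compliance)
    fidelity = 1.0 - statistics.pstdev(compliance)      # 1 = perfectly consistent
    return rating, conformance, fidelity
```

A provider that honours every SLA scores conformance 1.0 and fidelity 1.0, while one that alternates between compliance and violation keeps a middling conformance but is penalized on fidelity, which is exactly the distinction the vector is meant to surface.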
Email has become one of the fastest and most economical forms of communication. However, the growth in the number of email users has resulted in a dramatic increase in spam emails during the past few years. In this paper, email data was classified using four different classifiers (a Neural Network, an SVM classifier, a Naïve Bayesian classifier, and the J48 classifier). The experiments were performed with different data sizes and different feature-set sizes. The final classification result is ‘1’ if the email is spam and ‘0’ otherwise. This paper shows that the simple J48 classifier, which builds a binary tree, can be efficient for datasets that are well separated by a binary tree.
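As a rough illustration of the kind of binary-tree classification J48 performs, the following sketch builds a tiny information-gain decision tree over hypothetical binary word-presence features. The toy emails, features, and labels are invented for demonstration; real J48/C4.5 additionally handles continuous attributes and pruning.

```python
# Minimal information-gain decision tree over binary features
# (a toy stand-in for J48; data and features are hypothetical).
from collections import Counter
import math

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(X, y):
    """Pick the feature index whose split maximizes information gain."""
    base = entropy(y)
    best, best_gain = None, 0.0
    for j in range(len(X[0])):
        left = [yi for xi, yi in zip(X, y) if xi[j] == 0]
        right = [yi for xi, yi in zip(X, y) if xi[j] == 1]
        if not left or not right:
            continue
        gain = base - (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if gain > best_gain:
            best, best_gain = j, gain
    return best

def build_tree(X, y):
    if len(set(y)) == 1:
        return y[0]                       # pure leaf
    j = best_split(X, y)
    if j is None:
        return Counter(y).most_common(1)[0][0]
    branch = {}
    for v in (0, 1):
        Xv = [xi for xi in X if xi[j] == v]
        yv = [yi for xi, yi in zip(X, y) if xi[j] == v]
        branch[v] = build_tree(Xv, yv)
    return (j, branch)

def predict(tree, x):
    while isinstance(tree, tuple):
        j, branch = tree
        tree = branch[x[j]]
    return tree

# toy data: feature 0 = "contains 'free'", feature 1 = "contains 'meeting'"
X = [[1, 0], [1, 0], [0, 1], [0, 0]]
y = [1, 1, 0, 0]                          # 1 = spam, 0 = ham
tree = build_tree(X, y)
print(predict(tree, [1, 0]))  # → 1
```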
Noise reduction is essential to achieve an acceptable QoS in VoIP systems. This paper proposes a Wiener filter-based noise reduction scheme in which a logistic function is used to optimize the gain to the estimated SNR at each frequency bin. The proposed noise reduction scheme is applied as pre-processing before speech encoding. For various noisy conditions, the PESQ evaluation is performed to assess the performance of the proposed method, using G.711, G.723.1, and G.729A as the test VoIP speech codecs. The PESQ results show that the proposed noise reduction scheme outperforms both the noise suppression in the IS-127 EVRC and the noise reduction in the ETSI standard for the advanced distributed speech recognition front-end.
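The general shape of an SNR-dependent Wiener gain can be sketched as follows; the logistic steepness `k`, midpoint `snr0_db`, and spectral floor below are illustrative assumptions, not the paper's values.

```python
# Sketch of a per-bin Wiener gain blended toward a spectral floor
# by a logistic function of the estimated SNR (parameters assumed).
import math

def wiener_gain(snr_linear):
    """Classic Wiener gain for a given a priori SNR (linear scale)."""
    return snr_linear / (snr_linear + 1.0)

def logistic_weight(snr_db, k=0.5, snr0_db=5.0):
    """Logistic weighting as a function of SNR in dB."""
    return 1.0 / (1.0 + math.exp(-k * (snr_db - snr0_db)))

def modified_gain(snr_db, floor=0.1):
    """Blend the Wiener gain toward a spectral floor at low SNR."""
    snr = 10.0 ** (snr_db / 10.0)
    w = logistic_weight(snr_db)
    return w * wiener_gain(snr) + (1.0 - w) * floor

for snr_db in (-10, 0, 10, 20):
    print(f"{snr_db:>3} dB -> gain {modified_gain(snr_db):.3f}")
```

The spectral floor keeps low-SNR bins from being zeroed out entirely, which typically reduces musical-noise artifacts.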
This paper constructs a high-level Abstract State Machine (ASM) model of our conceptual software architecture for “living” cooperative information systems, founded in living systems theory. For practical execution, we use AsmL, the Abstract State Machine Language developed at Microsoft Research and integrated with Visual Studio, to refine the ASM model into an executable system model for evaluation.
Solving crimes is a complex task that requires a great deal of experience. Data mining can be used to model crime detection problems; the idea here is to capture years of human experience in computer models via data mining. Crimes are a social nuisance and cost our society dearly in several ways, so any research that helps solve crimes faster will pay for itself. According to the Los Angeles Police Department, about 10% of criminals commit about 50% of crimes. Here we look at the use of clustering algorithms in a data mining approach to help detect crime patterns and speed up the process of solving crimes. We examine k-means clustering with some enhancements to aid the identification of crime patterns. We applied these techniques to real crime data from a sheriff’s office and validated our results. We also used a semi-supervised learning technique for knowledge discovery from the crime records and to help increase the predictive accuracy. Our major contribution is the development of a weighting scheme for attributes that deals with the limitations of various out-of-the-box clustering tools and techniques. This easy-to-implement data mining framework works with a geo-spatial plot of crime and helps improve the productivity of detectives and other law enforcement officers. It can also be applied to counter-terrorism for homeland security.
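A minimal sketch of how an attribute weighting scheme can be folded into k-means; the weights, toy crime records, and fixed initial centers below are assumptions for illustration only, not the paper's scheme.

```python
# Attribute-weighted k-means sketch: the distance metric scales each
# attribute by a weight (toy data and weights, assumed for illustration).
def wdist(a, b, w):
    """Weighted squared Euclidean distance."""
    return sum(wi * (ai - bi) ** 2 for ai, bi, wi in zip(a, b, w))

def weighted_kmeans(points, weights, init, iters=20):
    centers = list(init)
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda c: wdist(p, centers[c], weights))
            clusters[i].append(p)
        centers = [
            tuple(sum(d) / len(c) for d in zip(*c)) if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# toy records: (normalized hour of day, x location, y location)
pts = [(0.1, 1.0, 1.1), (0.2, 1.1, 0.9), (0.9, 5.0, 5.2), (0.8, 5.1, 4.9)]
weights = (2.0, 1.0, 1.0)           # emphasize time-of-day (assumed choice)
centers, clusters = weighted_kmeans(pts, weights, init=[pts[0], pts[2]])
print(sorted(len(c) for c in clusters))  # → [2, 2]
```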
This paper introduces a new hill-climbing operator, MGAC, for GA optimization of combinatorial problems, and proposes two implementation techniques for it. The MGAC operator uses a small second-level GA with a small population that evolves for a few generations and serves as the engine for finding better solutions in the neighborhood of those produced by the main GA. The two implementations are tested on a power systems problem called the Unit Commitment Problem and compared with three other methods: a GA with classic hill-climbers, Lagrangian relaxation, and dynamic programming. The results show the superiority of the proposed MGAC operator.
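The idea of a tiny second-level GA refining one solution from the main GA might be sketched as follows; the population size, generation count, mutation rate, and the bitstring objective are toy choices, not the paper's MGAC parameters.

```python
# Micro-GA hill-climbing sketch: a small second-level GA searches the
# neighborhood of a single seed solution (all parameters are toy choices).
import random

def micro_ga(seed_sol, fitness, pop_size=6, gens=10, flip_p=0.1, rng=None):
    rng = rng or random.Random(0)
    def mutate(s):
        # flip each bit independently with probability flip_p
        return [b ^ (rng.random() < flip_p) for b in s]
    pop = [mutate(seed_sol) for _ in range(pop_size)]
    pop.append(list(seed_sol))              # keep the seed itself
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]      # truncation selection
        children = []
        for _ in range(pop_size):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(a))  # one-point crossover
            children.append(mutate(a[:cut] + b[cut:]))
        pop = parents + children
    return max(pop, key=fitness)

ones = lambda s: sum(s)                     # toy objective: maximize 1s
seed = [0] * 12
best = micro_ga(seed, ones)
print(ones(best) >= ones(seed))  # → True
```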
Web browsing has become extremely important in every field of life, whether education, business, or entertainment. With a simple mouse click, a user navigates through a number of web pages. This immediacy of traversing information links makes it difficult to maintain an intuitive sense of where one is and how one got there. A zooming browser was designed in Java to explore an alternate paradigm for navigating the WWW. Instead of having a single page visible at a time, multiple pages and the links between them are depicted on a large zoomable information surface. Links are shown in a hierarchy so that the user can see the relationship of web pages to their parent and child nodes. The browser also maintains the history of links traversed.
In this paper, we propose a verifiable multi-authority e-voting scheme which satisfies all the requirements of large-scale general elections. We use blind signatures for voters’ anonymity and a threshold cryptosystem to guarantee the fairness of the voting process. Our scheme supports all types of elections easily without increasing the complexity of the scheme. Moreover, our scheme allows open objection, which means a voter can complain at each stage while his privacy remains protected. Furthermore, the simplicity and low computational complexity of the protocol make it practical for general use.
Stochastic simulation is today a powerful tool for foreseeing possible dynamics of strict subsets of the real world. In recent years, it has been successfully employed in simulating cell dynamics with the aim of discovering exogenous quantities of chemicals able to deflect typical diseased simulation paths into healthy ones. This paper gives a broad overview of the stochastic simulation environment and offers an example of its possible use on a pathway triggered by DNA damage.
Causality assignment is an important task in physical modeling by bond graphs. Traditional causality assignment algorithms have specific aims and particular purposes, but they may fail if a bond graph has loops or contains junction causality violations. Some assignment algorithms focus on the generation of differential algebraic equations to take into account junction violations caused by nonlinear multi-port devices, and are not suitable for general bond graphs. In this paper, we present a formulation of the causality assignment problem as a constrained multi-objective optimization problem. Previous solution techniques for this problem include multi-objective branch-and-bound and the Pareto archived evolution strategy; both are highly complex and time-consuming algorithms. A new solution technique called gSEMO (global Simple Evolutionary Multi-objective Optimizer) is now used to solve the causality assignment problem, with very promising results.
Scheduling algorithms play an important role in the design of real-time systems. Due to the high processing power and low price of multiprocessors, real-time scheduling in such systems is more interesting, yet more complicated. Uniform multiprocessor platforms consist of different processors with different speeds or processing capacities; in such systems the same piece of code may require different amounts of time to execute on different processing units. It has been proved that there is no optimal online scheduler for uniform parallel machines. In this paper a new fuzzy-based algorithm for scheduling soft real-time tasks on uniform multiprocessors is presented. The performance of this algorithm is then compared with that of the EDF algorithm. It is shown that our proposed approach outperforms EDF in several respects, since it usually achieves a higher success ratio, utilizes the processors better, and produces a more balanced schedule.
Spatial truss structures are very popular in architecture and civil engineering. These structures are built from single structural elements of small size, so spatial trusses can be easily manufactured, transported, and assembled in practice. The aim of this work is to develop a Java software package for linear simulation of spatial truss structures using the finite element method. In this program, an element-oriented matrix notation is used instead of the node-oriented description of element quantities, and it is possible to visualize the model as well as the simulation results. The functionality of the software is demonstrated by means of some application examples. The results and visualizations of the numerical examples confirm that the presented finite element analysis program, based on an object-oriented methodology, can be used effectively for spatial trusses.
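The core quantity in any such program is the element stiffness matrix of a 3D truss bar. The standard textbook formulation is sketched below in Python for brevity (the paper's package itself is written in Java; the material constants and coordinates are toy values).

```python
# Element stiffness matrix of a 3D truss bar: K = (EA/L) * [cc -cc; -cc cc],
# where cc is the outer product of the direction cosines (standard FEM).
import math

def truss_element_stiffness(p1, p2, E, A):
    """6x6 stiffness matrix of a bar between nodes p1 and p2."""
    dx = [b - a for a, b in zip(p1, p2)]
    L = math.sqrt(sum(d * d for d in dx))
    c = [d / L for d in dx]                 # direction cosines
    k = E * A / L
    K = [[0.0] * 6 for _ in range(6)]
    for i in range(3):
        for j in range(3):
            kij = k * c[i] * c[j]
            K[i][j] = K[i + 3][j + 3] = kij
            K[i][j + 3] = K[i + 3][j] = -kij
    return K

# toy steel bar of unit length along the x axis
K = truss_element_stiffness((0, 0, 0), (1, 0, 0), E=210e9, A=1e-4)
print(K[0][0])  # EA/L along the bar axis → 21000000.0
```

Each row of K sums to zero, reflecting the equilibrium of the free bar.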
Conventional ATPG algorithms fail when applied to asynchronous circuits due to the absence of a global clock and the presence of many state-holding elements that synchronize the control and data paths, leading to poor fault coverage. This paper presents three DFT implementations for the asynchronous NULL Convention Logic (NCL) paradigm, with the following salient features: 1) testing with commercial DFT tools is shown to be feasible; 2) a high test coverage is achieved; and 3) minimal area overhead is required. The first technique incorporates XOR gates for inserting test points; the second uses a scan-latch scheme to improve observability; and in the third, scan latches are inserted in the internal gate feedback paths. The approaches have been automated, which is essential for large systems, and are fully compatible with industry-standard tools.
This paper applies the ant colony system (ACS), a distributed algorithm, to the Stable Marriage Problem (SM), an extensively studied combinatorial problem with many practical applications. It is well known that at least one stable matching exists for every stable marriage instance. However, the classical Gale-Shapley [2] algorithm produces a marriage that greatly favors the men at the expense of the women, or vice versa. In our proposed ACS, a set of cooperating agents called ants work together to find stable matchings of several kinds: man-optimal, woman-optimal, egalitarian, and sex-fair. ACS thus offers a novel method for solving the Stable Marriage Problem. Our simulation results show the effectiveness of the proposed ACS.
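For reference, the classical Gale-Shapley algorithm that the paper contrasts with can be sketched as follows; it returns the man-optimal stable matching (the preference lists are a toy instance).

```python
# Classical Gale-Shapley: men propose, women tentatively accept the
# best proposer so far; the result is the man-optimal stable matching.
def gale_shapley(men_prefs, women_prefs):
    free = list(men_prefs)                     # men still unmatched
    next_choice = {m: 0 for m in men_prefs}    # next woman to propose to
    engaged = {}                               # woman -> man
    rank = {w: {m: i for i, m in enumerate(p)}
            for w, p in women_prefs.items()}
    while free:
        m = free.pop(0)
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])            # current partner is dumped
            engaged[w] = m
        else:
            free.append(m)                     # w rejects m
    return {m: w for w, m in engaged.items()}

men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["b", "a"], "y": ["a", "b"]}
print(gale_shapley(men, women))  # → {'a': 'x', 'b': 'y'}
```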
In this paper, a Q-learning-based univector field method is proposed for a mobile robot to accomplish obstacle avoidance while reaching the desired orientation at the target position. The univector field method guarantees the desired posture of the robot at the target position, but it does not steer the robot around obstacles. To solve this problem, a modified univector field is used and trained by Q-learning: when the robot following the field toward the desired posture collides with obstacles, the univector fields at the collision positions are modified according to the reinforcement of the Q-learning algorithm. With this navigation method, the robot's task in a dynamically changing environment becomes easier, since double-action Q-learning [8] is used to train the univector field instead of ordinary Q-learning. Computer simulations and experiments on an obstacle-avoiding mobile robot demonstrate the effectiveness of the proposed scheme.
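The underlying tabular Q-learning update can be sketched on a toy 1-D corridor; the rewards, learning rate, and corridor itself are illustrative, and neither the paper's double-action Q-learning [8] nor the univector field is reproduced here.

```python
# Plain tabular Q-learning on a 1-D corridor with a goal at the right
# end (all parameters and the reward shaping are toy choices).
import random

def train(n_states=5, episodes=300, alpha=0.5, gamma=0.9, eps=0.2, seed=1):
    random.seed(seed)
    Q = {(s, a): 0.0 for s in range(n_states) for a in (-1, 1)}
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            if random.random() < eps:          # epsilon-greedy exploration
                a = random.choice((-1, 1))
            else:
                a = max((-1, 1), key=lambda act: Q[(s, act)])
            s2 = min(max(s + a, 0), n_states - 1)
            r = 1.0 if s2 == n_states - 1 else -0.01   # reward at the goal
            # the Q-learning update rule
            Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, -1)], Q[(s2, 1)])
                                  - Q[(s, a)])
            s = s2
    return Q

Q = train()
# the greedy policy should move right toward the goal from every state
print(all(Q[(s, 1)] > Q[(s, -1)] for s in range(4)))  # → True
```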
Domino logic circuits have been aggressively explored for vulnerabilities to crosstalk noise. In these circuits, statistical modeling of crosstalk noise is a promising approach, because crosstalk noise is highly unpredictable as technology trends push process variations to their extremes and shrinking feature sizes cause unevenness in device geometries. We present a general model for crosstalk noise that accounts for variance in the cross-coupling capacitance and variation in the channel width of MOS devices, and progressively refine it into an accurate circuit analysis model for deriving the crosstalk distribution. The statistical model is validated with 1000 runs of Monte Carlo simulation.
A new task in Interactive Information Retrieval (IIR) is considered: optimization of information retrieval that takes into account its impact on the quality of interaction with the user. Dual IIR (DIIR) is defined, and an integer programming model for DIIR is given.
This article analyses business rules and their application to data analysis and decision making as an aid for business participants. The main focus is on the application of business rules; as an example, the debtor management problem faced by many marketing companies is examined. The article also scrutinizes the principles by which business rules operate, the opportunities for their application, and the practical employment of these rules.
Fundamental aspects of scientific research in the field of atomic physics are discussed in this paper from the point of view of an information system that would cover the most important phases of research. Such an information system should encompass the complexity of scientific research, incorporating data scattered across various books, articles, research centers, databases, etc. We started from scratch with a principal analysis of the basic research processes and the data that represent research needs and condensed research experience. The particular problem of searching for data is discussed in detail, and the main idea of the newly proposed approach is described. We developed a prototype information system to be used by researchers in various research phases. The search for data is web-based, as the web is the standard way of providing easy data access.
The general idea of the proposed approach is to integrate simple reactive intelligence acquired by experimentation with planning and learning processes. The autonomous agent [1] can be considered a representative of an intelligent entity located in the real world; it is expected to express rational behavior and possess the ability to learn in pursuit of its goals. The main objective of this paper is to construct a cognitive model of an agent that is capable of rational behavior in a dynamic environment. Concepts such as goals, reactivity, and planning are investigated in the context of an agent that makes decisions and takes actions in a completely or partly unknown environment. We also propose the integration of reactive and planning decision-selection mechanisms by applying, on the basis of reinforcement, the concept of trust in the agent's decisions. When designing our agent, we applied a bottom-up approach, building on some of the relevant research in this area. The primary advantage of this approach is shown by the improved performance of the agent during the execution of the given task. The effectiveness of the proposed solution was initially tested in a simulated environment (the evasive maneuver problem).
We present a general framework for simulating the behavior of free feather-like objects inside a dynamically changing flow field. Free feathers demonstrate beautiful dynamics as they float, flutter, and twirl in response to the lift and drag forces created by their motion relative to the flow. To simulate their movement in 2D, we adopt the thin strip model to account for the effects of gravity, lift, and inertial drag. To achieve 3D animations, we implement two methods. In the first approach, we extend the thin strip model, use either flow primitives or noise functions to construct a time-varying flow field, and extract external forces to update the thin strip computation. In the second approach, we implement a physically based simulation of the flow field and adopt the momentum-exchange method to evaluate the body force on the feather. As a result, the natural flutter, tumble, and gyration dynamics emerge, and vortices are created, all in response to local surface-flow interactions without the imposition of the thin strip model.
We study the classical problem of deadlock detection for systems with n processes and d reusable resource types, where d≪n. We present a novel algorithm for the problem that enjoys two properties. First, its cost is n/log(n) times smaller than that of the well-known Dijkstra's algorithm when d=O(log(n)). Second, its data structures are simple and easy to maintain; in particular, the algorithm employs no graph- or tree-based data structures. We also derive a linear-time algorithm when d and the resource requests are bounded by constants; this linear-time algorithm is asymptotically optimal. The algorithms are applicable to improving the Banker's algorithm for deadlock avoidance. Categories and Subject Descriptors: D.4.1 Operating Systems: Process Management; General Terms: Deadlock, algorithms, performance
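For context, the textbook matrix-based deadlock detection check (not the paper's novel algorithm) can be sketched as follows; the resource example is invented for illustration.

```python
# Textbook deadlock detection: a process can finish if its outstanding
# request fits in the currently available resources; finished processes
# release their allocation; any process that can never finish is deadlocked.
def detect_deadlock(available, allocation, request):
    work = list(available)
    finished = [False] * len(allocation)
    progress = True
    while progress:
        progress = False
        for i, done in enumerate(finished):
            if not done and all(r <= w for r, w in zip(request[i], work)):
                # process i can finish and release what it holds
                work = [w + a for w, a in zip(work, allocation[i])]
                finished[i] = True
                progress = True
    return [i for i, done in enumerate(finished) if not done]

# two resource types; processes 1 and 2 are blocked waiting on each other
allocation = [[0, 0], [2, 0], [1, 1]]
request = [[0, 0], [1, 1], [2, 0]]
print(detect_deadlock([1, 0], allocation, request))  # → [1, 2]
```

This runs in O(n²·d) in the worst case, which is the kind of cost the paper's specialized algorithm improves upon for small d.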
A measure of machine intelligence facilitates comparing alternatives having different complexity. In this paper, a method for measuring the machine intelligence quotient (MIQ) of human-machine cooperative systems is adapted and applied to measure the MIQ of an agent-based distributed sensor network system. Results comparing the MIQ of different agent-based scenarios are presented for the distributed sensor network application. The MIQ comparison is contrasted with the average sensor network field life, a key performance indicator, achieved with each scenario in Monte Carlo simulations.
In this work, silo discharge was viewed as a complex fluid flow in order to perfect a new technique for the measurement of flow rate. Flow rate was investigated using a non-intrusive method measuring the evolution of the free-surface profile during the discharge flow. The method consists of recording, via a CCD sensor, the evolution of the free surface illuminated by laser planes, and then obtaining the free-surface position and shape over time by image processing.
Robustness is a key issue in watermarking, and achieving robustness together with a blind watermark recovery algorithm remains especially challenging. This paper presents a combined DWT-DCT blind watermarking algorithm for still images. A two-level DWT is performed on the original image; the low-frequency sub-band is divided into blocks; the DCT is applied to each block; the DCT coefficients of each block are sorted in zig-zag order; and the lowest-frequency DCT coefficient is selected as the embedding point. The watermark signals are embedded into the selected points using modular arithmetic. Watermark recovery is the inverse of embedding: from the result of the modular arithmetic, the value of the embedded watermark is estimated. The algorithm is compared with a pure DWT-based scheme. Experimental results show that the proposed algorithm is robust to many attacks, such as JPEG compression, additive noise, cropping, median filtering, rotation, and resizing. The proposed algorithm also provides good results in terms of image imperceptibility.
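One way to read the modular-arithmetic embedding step is as quantization of a selected coefficient, sketched below; the step size q and the even/odd encoding convention are assumptions for illustration, not the paper's parameters.

```python
# Sketch of embedding one watermark bit in a transform coefficient via
# modular arithmetic: snap the coefficient to a multiple of q whose
# parity encodes the bit (q and the convention are assumed).
def embed_bit(coeff, bit, q=8.0):
    """Quantize coeff to a multiple of q whose parity equals bit."""
    base = q * round(coeff / q)
    if (round(base / q) % 2) != bit:       # even multiples encode 0, odd 1
        base += q
    return base

def recover_bit(coeff, q=8.0):
    """Blind recovery: read the parity of the nearest multiple of q."""
    return int(round(coeff / q)) % 2

coeffs = [13.2, -5.7, 40.1, 0.3]           # toy DCT coefficients
bits = [1, 0, 1, 0]
marked = [embed_bit(c, b) for c, b in zip(coeffs, bits)]
print([recover_bit(c) for c in marked])  # → [1, 0, 1, 0]
```

Recovery needs only q, not the original image, which is what makes the scheme blind; robustness comes from attacks having to move a coefficient by more than q/2 to flip a bit.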
With the increasing rate of myocardial infarction (MI) in men and women, it is important to develop a diagnostic tool to determine the effect of MI on the mechanics of the heart and to minimize the effect of heart muscle damage on overall cardiac performance. After a myocardial infarct, the left ventricle of the heart enlarges to compensate for a weak heart muscle, and the enlarged and weakened heart gives rise to the clinical syndrome of heart failure. To maximize the mechanical performance of the weakened heart, regional ventricular loading and contraction must be understood, and to isolate regional wall mechanics, a floating centroid for the left ventricle must be calculated. This is easy in the normal heart, where the left ventricle approximates a single radius of curvature; in heart failure, however, irregular shape changes complicate the calculation. The conventional method for centroid calculation employs a center-of-mass (COM) determination of the whole left ventricle, which has many shortcomings when applied to an enlarged and irregular left ventricle. This paper proposes a new algorithm for centroid calculation based on iterative majorization to locate the centroid.
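A classic instance of iterative majorization for a robust centroid is the Weiszfeld update for the geometric median, sketched below; the paper's actual ventricular objective may differ, and the point set here is a toy example.

```python
# Weiszfeld iteration for the geometric median: each step minimizes a
# quadratic majorizer of the sum of distances (toy 2-D point set).
import math

def geometric_median(points, iters=100, eps=1e-9):
    x = [sum(c) / len(points) for c in zip(*points)]   # start at the mean
    for _ in range(iters):
        num = [0.0] * len(x)
        den = 0.0
        for p in points:
            d = math.dist(p, x)
            if d < eps:            # avoid division by zero at a data point
                return list(p)
            num = [n + pi / d for n, pi in zip(num, p)]
            den += 1.0 / d
        x = [n / den for n in num]
    return x

# an outlier drags the mean far more than it drags the geometric median
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (10.0, 10.0)]
m = geometric_median(pts)
print(round(m[0], 3), round(m[1], 3))
```

The robustness to the outlier is the property that motivates using a majorization-based centroid on an irregular ventricle instead of a plain center of mass.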