Journal of Research and Practice in Information Technology

Published by Australian Computer Society Inc
Document Logic is a simple yet powerful framework for inferring risks in business processes. We focus on flows of documents and build a set of inference rules based on document authenticity and a simple trust model. We have built a prototype system that checks document authenticity in Maude, an implementation of rewriting logic. Rewriting logic is expressive and general enough to define other specialized logics, such as Document Logic. In our framework, a business process is modeled as a transition system. Our prototype takes a business process and an undesired situation as input and outputs all the possible risks in the business process.
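The transition-system view of a business process can be illustrated with a small sketch (all state and action names here are hypothetical, and the actual prototype is written in Maude, not Python): a breadth-first search enumerates the action sequences that drive the process into an undesired situation.

```python
from collections import deque

def risky_paths(transitions, start, undesired, max_len=6):
    """Enumerate action sequences that drive the process from `start`
    to the `undesired` state. transitions: {state: [(action, next_state)]}."""
    found, queue = [], deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        if state == undesired:
            found.append(path)
            continue
        if len(path) < max_len:
            for action, nxt in transitions.get(state, []):
                queue.append((nxt, path + [action]))
    return found

# Toy document flow (states and actions are invented for illustration):
flow = {
    "drafted": [("sign", "signed"), ("forge", "forged")],
    "signed":  [("archive", "archived")],
    "forged":  [("archive", "archived-unverified")],
}
print(risky_paths(flow, "drafted", "archived-unverified"))  # [['forge', 'archive']]
```

Each returned path is one concrete risk: a sequence of actions that ends in the undesired situation.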
Performance estimation of processors is important for selecting the right processor for an application. A poorly chosen processor can either underperform badly or overperform at unnecessary cost. Most previous work on performance estimation is based on generating the development tools (compilers, assemblers, etc.) from a processor description file and then additionally generating an instruction set simulator to obtain the performance. In this work we present a simpler strategy for performance estimation. We propose an estimation technique based on the intermediate format of an application. The estimation process does not require the generation of all the development tools, as the prevalent methods do. As a result our method is not only cheaper but also faster.
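The idea can be sketched as follows (operation names and latency values are invented for illustration; the paper's intermediate format and cost model are more elaborate): performance is estimated directly from an operation histogram of the application's intermediate representation, with no generated toolchain.

```python
def estimate_cycles(ir_op_counts, latency_table, default_latency=1):
    """Estimate execution cycles from an IR operation histogram and a
    per-operation latency table for the candidate processor."""
    return sum(count * latency_table.get(op, default_latency)
               for op, count in ir_op_counts.items())

# A loop kernel lowered to a simple IR histogram (hypothetical numbers):
counts = {"load": 200, "store": 100, "mul": 100, "add": 300, "branch": 100}
latencies = {"load": 3, "store": 1, "mul": 4, "add": 1, "branch": 2}
print(estimate_cycles(counts, latencies))  # 1600
```

Swapping in a different processor's latency table re-estimates the same application without regenerating any tools.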
This work shows that the cryptanalysis of the shrinking generator requires fewer intercepted bits than is indicated by the linear complexity. Indeed, whereas the linear complexity of shrunken sequences is between $A \cdot 2^{S-2}$ and $A \cdot 2^{S-1}$, we claim that the initial states of both component registers can easily be computed with fewer than $A \cdot S$ shrunken bits. This result is proven by defining shrunken sequences as interleaved sequences. Consequently, we conjecture that this statement can be extended to all interleaved sequences. Furthermore, this paper confirms that certain bits of the interleaved sequences have greater strategic importance than others, which may be considered a proof of weakness of interleaved generators.
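For readers unfamiliar with the construction, a minimal sketch of the shrinking generator follows (toy 3-bit registers with an illustrative primitive polynomial; real generators use much longer registers of lengths A and S): selection register S decimates data register A, keeping A's output bit only when S outputs 1.

```python
def lfsr(state, taps, n):
    """n output bits of a Fibonacci LFSR: output the last stage, feed back
    the XOR of the tapped stages at the front."""
    state, out = list(state), []
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

def shrinking_generator(a_state, a_taps, s_state, s_taps, n_clock):
    """Clock registers A and S together; keep A's bit only when S outputs 1."""
    a_bits = lfsr(a_state, a_taps, n_clock)
    s_bits = lfsr(s_state, s_taps, n_clock)
    return [a for a, s in zip(a_bits, s_bits) if s == 1]

# Both toy registers use x^3 + x + 1 (taps at stages 0 and 2).
print(shrinking_generator([1, 0, 0], [0, 2], [1, 1, 1], [0, 2], 7))  # [0, 0, 1, 1]
```

The irregular decimation is what makes the shrunken sequence an interleaved sequence, the property the attack exploits.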
In this paper, we develop a new cellular automata-based linear model for several nonlinear pseudorandom number generators with practical applications in symmetric cryptography. Such a model generates all the solutions of linear binary difference equations, and many of these solutions are pseudorandom keystream sequences. In this way, a linear structure based on cellular automata may be used to generate not only difference equation solutions but also cryptographic sequences. The proposed model is very simple since it is based exclusively on successive concatenations of a basic linear automaton.
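The paper's specific automaton is not reproduced here, but a standard linear cellular automaton of the kind used in such models is the one-dimensional hybrid 90/150 CA, sketched below for illustration:

```python
def ca_step(state, rules):
    """One step of a one-dimensional hybrid 90/150 CA with null boundaries.
    Rule 90: next = left XOR right; rule 150: next = left XOR self XOR right."""
    n = len(state)
    nxt = []
    for i in range(n):
        left = state[i - 1] if i > 0 else 0
        right = state[i + 1] if i < n - 1 else 0
        cell = left ^ right
        if rules[i] == 150:
            cell ^= state[i]
        nxt.append(cell)
    return nxt

def ca_sequence(state, rules, n):
    """Bit sequence read from cell 0 over n successive steps."""
    out = []
    for _ in range(n):
        out.append(state[0])
        state = ca_step(state, rules)
    return out

print(ca_sequence([1, 0, 0], [90, 150, 90], 4))  # [1, 0, 1, 1]
```

Because every rule is an XOR of neighbouring cells, the whole evolution is linear over GF(2), which is what lets a CA of this kind model the solutions of linear binary difference equations.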
Comparison of direct resource access to virtual resource access  
The continuous and autonomous workflow of agents by DTP
DTP planning: matching objectives with reusable entries in the OMB. Each entry of the OMB has four components: an entry name with parameters, Process, PreProcess and PostProcess. Process is executed after PreProcess and before PostProcess.
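The stated execution order can be sketched as follows (class and field names are hypothetical, not the actual DTP data structures):

```python
class OMBEntry:
    """One OMB entry: an entry name with parameters plus three process parts."""
    def __init__(self, name, pre_process, process, post_process):
        self.name = name
        self.pre_process = pre_process
        self.process = process
        self.post_process = post_process

    def run(self, *params):
        # The order stated in the text: PreProcess, then Process, then PostProcess.
        self.pre_process(*params)
        result = self.process(*params)
        self.post_process(*params)
        return result

trace = []
entry = OMBEntry(
    "fetch_report",                                   # hypothetical entry name
    pre_process=lambda ctx: trace.append("pre"),
    process=lambda ctx: trace.append("proc") or "done",
    post_process=lambda ctx: trace.append("post"),
)
print(entry.run({}), trace)  # done ['pre', 'proc', 'post']
```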
Mobile Agent Template (MAT) is a mobile agent system under study and development at the Institute of Computing Technology, Chinese Academy of Sciences, and sponsored by the University of Wollongong, Australia. MAT is not an alternative to other mobile agent systems, but an agent system that provides autonomy to mobile agents. MAT aims to support new Web applications, such as mobile computation, through autonomous and mobile agents. The Mobile Thread Programming Model (MTPM), Distributed Task Plan (DTP) and Active State Space (ASS) are the integral components on which MAT is constructed. Integration of these three components provides agents with an autonomous work mode and an autonomy-supporting execution environment. In this paper, we define the autonomies of agents in the context of mobility and propose our theories of autonomy: autonomous workflow, asynchronous and localized interactions, and a virtual supporting environment. This paper also outlines the current implementation mechanisms of MAT, including its architecture, programming paradigm, distributed task planning and communications. The main contributions of this research are that: (1) workflows are adopted as agents' working modes; (2) goal-directed and dynamic task planning is used to deal with the heterogeneity and dynamism of networks; and (3) a virtually platform-independent environment is constructed to provide mobile agents with asynchronous, anonymous and fully localized interactions. The innovation of this research is to provide a new solution for novel Web applications such as mobile computations by using MAT.
Many existing constructive decision tree learning algorithms such as Fringe and Citre construct conjunctions or disjunctions directly from paths of decision trees. This paper investigates a novel attribute construction method for decision tree learning. It creates conjunctions from production rules that are transformed from decision trees. Irrelevant or unimportant conditions are eliminated when paths are transformed into production rules. Therefore, this new method is likely to construct new attributes with relevant conditions. Three constructive induction algorithms based on this basic idea are described and are empirically evaluated by comparing with C4.5 and a Fringe-like algorithm in a set of artificial and natural domains. The experimental results reveal that constructing conjunctions using production rules can significantly improve the performance of decision tree learning in the majority of the domains tested in terms of both higher prediction accuracy and lower theory complexity. These results suggest an advantage of the attribute construction method that uses production rules over the method of constructing new attributes directly from paths in noisy domains.
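The core idea, dropping conditions from a path-derived rule when they do not help, can be sketched as a greedy simplification (the actual algorithms' elimination criteria are more refined than this illustrative accuracy test):

```python
def satisfies(example, conds):
    return all(example.get(attr) == val for attr, val in conds)

def rule_accuracy(conds, cls, data):
    """Accuracy of rule `conds -> cls` over the examples it covers."""
    covered = [ex for ex in data if satisfies(ex, conds)]
    if not covered:
        return 0.0
    return sum(ex["class"] == cls for ex in covered) / len(covered)

def simplify_rule(conds, cls, data):
    """Greedily drop conditions that do not reduce the rule's accuracy."""
    conds = list(conds)
    changed = True
    while changed:
        changed = False
        for c in list(conds):
            trial = [x for x in conds if x != c]
            if trial and rule_accuracy(trial, cls, data) >= rule_accuracy(conds, cls, data):
                conds, changed = trial, True
                break
    return conds

# Attribute b is irrelevant to the class, so the path condition on b is dropped.
data = [{"a": 1, "b": 0, "class": "yes"}, {"a": 1, "b": 1, "class": "yes"},
        {"a": 0, "b": 0, "class": "no"},  {"a": 0, "b": 1, "class": "no"}]
print(simplify_rule([("a", 1), ("b", 0)], "yes", data))  # [('a', 1)]
```

The surviving conditions are then available for conjunction into a new attribute, which is the construction step the paper evaluates.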
Parameters used for each filter type. 
Recognition accuracy (%) of acoustic neural networks. 
Recognition accuracy (%) of visual neural networks. 
Automatic speech recognition (ASR) performs well under restricted conditions, but performance degrades in noisy environments. Audio-Visual Speech Recognition (AVSR) combats this by incorporating a visual signal into the recognition. This paper briefly reviews the contribution of psycholinguistics to this endeavour and the recent advances in machine AVSR. An important first step in AVSR is feature extraction from the mouth region, and a technique developed by the authors is briefly presented. This paper examines how useful this extraction technique, in combination with several integration architectures, is at the given task; demonstrates that vision does in fact assist speech recognition when used in a linguistically guided fashion; and gives insight into remaining issues.
METEOR system architecture  
Eligibility Referral Workflow
Schematic view of the Immunization Tracking application
Implementation test-bed for the Immunization Tracking application  
Healthcare enterprises involve complex processes that span diverse groups and organizations. These processes involve clinical and administrative tasks, large volumes of data, and large numbers of patients and personnel. The tasks can be performed either by humans or by automated systems. In the latter case, the tasks are supported by a variety of software applications and information systems which are very often heterogeneous, autonomous, and distributed. The development of systems to manage and automate these processes has increasingly played an important role in improving the efficiency of healthcare enterprises.
High level view of our SECDW profile  
New stereotypes  
At present, it is very difficult to develop a methodology that fulfills all criteria and comprises all security constraints in the successful design of data warehouses. If that methodology were developed, its complexity would hinder its success. The solution, therefore, would be an approach in which techniques and models defined by the most accepted model standards were extended by integrating the necessary security aspects that at this moment in time are not covered by the existing methodologies. In this paper, we will focus on solving confidentiality problems in the conceptual modelling of data warehouses by defining a profile using the UML 2.0 extensibility mechanisms. In addition, we define an OCL extension that allows us to specify the security constraints of the elements in conceptual modelling of data warehouses and we apply this profile to an example. Keywords: Secure Data Warehouse, UML profile, OCL, security, confidentiality. ACM Classification : D2.2 (Design Tools and Techniques), K6.5 (Security and Protection)
With the explosive growth of the Internet, electronic-commerce (e-commerce) is an increasingly important segment of commercial activities on the web. The Secure Agent Fabrication, Evolution and Roaming (SAFER) architecture was proposed to further facilitate e-commerce using agent technology. In this paper, the electronic payment aspect of SAFER will be explored. The Secure Electronic Transaction (SET) protocol and E-cash were selected as the bases for the electronic payment system implementation. The various modules of the payment system and how they interface with each other are shown. An extensible implementation using Java will also be elaborated. This application incorporates agent roaming functionality and the ability to conduct e-commerce transactions and carry out intelligent e-payment procedures.
Parameters and design choices.
This article was written to commemorate the Basser Computing Laboratory's early contributions to communications technology by examination of an underused modern technology. It commences with a brief tutorial on spread-spectrum communications, together with a summary of the salient characteristics of the technology. Outdoor and non-mobile uses of Wireless LAN NICs are then introduced with two case studies. In conclusion, the policy implications of spread-spectrum technology for broadband Internet access in regional Australia are examined, together with an analysis of some opportunities for Australian entrepreneurs.
Life Cycle of an eCo  
Referral to a Specialist – Transfer Protocol  
Viewing a Health Data Item  
Consent view  
Granting Access to the Receiving Provider  
This paper describes an eConsent model and demonstrator used to investigate the implementation of patient consent as a means of controlling access to electronic health information shared between healthcare providers. The model and demonstrator described here are designed to operate in an environment of independent cooperating healthcare facilities, such as medical clinics and hospitals, where each facility is responsible for controlling access to the health information in its keeping, according to the patient's expressed conditions as recorded and held by the facility. Novel, privacy-preserving transfer protocols are used to ensure that access to the health information at the receiving facility continues to be governed by the patient's consent. The work was well-received at a symposium where a wide range of stakeholders were offered an opportunity to consider the clinical, legal and technical feasibility of the approach represented by the demonstrator.
This paper discusses some of the major issues in information requirements determination, and argues that failure to remedy or attend to these issues may ultimately contribute to failures in information systems development projects. A key to improving the practice of information requirements determination is suggested to lie in ensuring a shared understanding between analyst(s) and key participants. Yet this concept of shared understanding is itself shown to be problematic. An action research study which demonstrated the helpfulness of cognitive mapping in achieving shared understanding is described and discussed.
Agent-based electronic commerce (e-commerce) has been booming with the development of the Internet and agent technologies. However, little effort has been devoted to exploring the learning and evolving capabilities of software agents. This paper addresses issues of evolving software agents in e-commerce applications. An agent structure with evolution features is proposed, with a focus on internal hierarchical knowledge. We argue that the knowledge base of an agent should be the cornerstone of its evolution capabilities, and that agents can enhance their knowledge bases by exchanging knowledge with other agents. In this paper, product ontology is chosen as an instance of a knowledge base. We propose a new approach to facilitate ontology exchange among e-commerce agents. The ontology exchange model and its formalisms are elaborated. Product-brokering agents have been designed and implemented which accomplish the ontology exchange process from request to integration.
The IT professional unlike professionals in other disciplines does not have to abide by the strictures of a professional society. In Australia the professional IT association has a Code of Ethics that, while easily accessible, needs clarification to apply it in the real world. Though the ACS code is distinctly Australian in the way it has been formulated, it sits easily within the general tenets espoused by similar associations in other countries. Although cultural issues have influenced the moral philosophy of the ACS code, there are lessons from other countries that apply in the Australian context. Interpreting the code and applying it to one's situation can be facilitated through seeing how others have applied the code and through understanding its underlying tenets.
Nondeterminism has a central role in computer science, and poses difficulties for many formalisations of domain knowledge in artificial intelligence. In particular, the nondeterminism of actions often complicates the formal treatment of actions. This paper elucidates some possible sources of nondeterminism in theories of actions. The emphasis is on how nondeterminism may arise from inadequacies of the representation language, the identification of which can suggest strategies for reducing or eliminating the apparent nondeterminism.
Reliability information and test of convergent validity 
It is generally agreed within information systems research that end-user computing (EUC) among professionals is critical to their job performance. The main assumption among IS-researchers is that software usage contributes to improved performance. This study suggests that end-user computing may influence job performance in a more comprehensive way than earlier assumed. To address this issue, a set of purposeful core activities in EUC has been identified. The influence of these EUC activities on job performance is tested in a study of 328 professionals. The results demonstrate that the activities "job-task specific computer utilisation", "non-job task specific computer utilisation" and the "providing of support to colleagues" have impact on professionals' job performance. Our findings have important implications for management of EUC and for research in the area of EUC. To that end, we offer directions for future research. ACS categories: K.6 (Management of Computing and Information Systems), K.8 (Personal Computing)
This study analysed the sender and receiver addresses of 3,417 unsolicited e-mails. Some 60.3% of the unsolicited e-mails were found to have an invalid sender address, and in 92.8% the receiver address did not appear in the "To" or "CC" headers. The analytical results indicate that e-mail addresses in the header can provide a cue for filtering junk e-mails. Categories and Subject Descriptors: H.3.3 [Information Search and Retrieval]: Information filtering.
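The header cue can be checked mechanically; the sketch below (illustrative only, not the study's actual filter) flags mail whose recipient is absent from the To and CC headers:

```python
from email import message_from_string

def header_cue_suspicious(raw_message, recipient):
    """Flag mail whose recipient does not appear in the To or CC headers,
    the cue observed in 92.8% of the unsolicited e-mails."""
    msg = message_from_string(raw_message)
    listed = (msg.get("To", "") + "," + msg.get("Cc", "")).lower()
    return recipient.lower() not in listed

raw = "From: x@example.com\nTo: undisclosed-recipients:;\nSubject: hi\n\nbody"
print(header_cue_suspicious(raw, "me@example.org"))  # True
```

A real filter would combine this cue with sender-address validation, the other signal the study measured.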
Framework of SME Adoption of Innovations
The technological environment in which contemporary small- and medium-sized enterprises (SMEs) operate can only be described as dynamic. The exponential rate of technological change, characterised by perceived increases in the benefits associated with various technologies, shortening product life cycles and changing standards, provides a complex and challenging operational context for the SME. The primary aim of this research was to concentrate on those SMEs that had already adopted technology, in order to identify their needs for the new mobile data technologies (MDT), the mobile Internet. The research design utilised a mixed approach whereby both qualitative and quantitative data were collected to address the question. Overall, the needs of these SMEs for MDT can be conceptualised into three areas where the technology will assist business practices: communication, e-commerce and security.
The level of usage of modern e-money systems in Australia remains low, despite potential benefits and widespread use internationally. This study investigated the characteristics of modern Australian e-money products perceived as most problematic by Australian merchants. Forty-one merchants accepting e-money online and 41 merchants accepting alternative online payments methods identified which of a series of product characteristics would require most improvement before either initial adoption or more prominent usage would be undertaken. It was found that merchants using e-money products primarily required a higher level of consumer participation and lower price, but were relatively satisfied with the levels of usability and number of features offered. In contrast, merchants without any experience using e-money systems distrusted them, and required more information about the products and their features before they made a decision to adopt. The study lends support to the 'bundle of goods' view rather than the pure price or 'rational consumer' theory as an explanation for e-money adoption behaviour.
The codebook of 32 regular shaped 64-pixel patterns, defined in 16×16 blocks, where the shaded region represents 1 (motion) and white region represents 0 (no motion).  
Example Macroblocks with Quadrants (a) (b) (c)  
(a) Miss America frame #2, (b)–(d) Reconstructed frames using the H.264, Fixed-8, and EVPS(8, 75%) algorithms respectively, (e)–(g) Frame differences (×6) of (b), (c), and (d) respectively with respect to (a)  
(a) Suzie frame #2, (b)-(d) Reconstructed frames using the H.264, Fixed-8, and EVPS(8, 75%) algorithms respectively, (e)-(g) Frame differences (×6) of (b), (c), and (d) respectively with respect to (a)
In the context of very low bit-rate video coding, pre-defined fixed-pattern representations of moving regions in block-based motion estimation and compensation have become increasingly attractive relative to H.264, as they represent a macroblock by a smaller moving region covered by the best available pattern approximating the shape of the region, and hence require no extra motion vector, which is not the case with H.264. But a fixed pattern set sometimes fails to code all video sequences efficiently. In this paper a novel idea of selecting a subset of best-matched patterns through a preferential selection technique is developed by presenting two algorithms, Variable Pattern Selection (VPS) and Extended VPS (EVPS), for an initial pattern codebook of size 32, using a new parametric macroblock classification definition and a new similarity metric. The complexity analysis confirmed that EVPS is guaranteed to be nearly 6 times faster than VPS, with peak performance providing an improvement factor of 9. The overall performance of EVPS is identical to VPS for certain parameters but on average 0.2 dB and 0.8 dB better than the contemporary fixed-pattern algorithm and the Advanced Video Coding standard (H.264) respectively, for the same number of bits per frame.
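Matching a motion region against a pattern codebook can be sketched as follows (the similarity score here, matched 1s minus mismatched positions over flattened binary masks, is purely illustrative; the paper defines its own similarity metric, and its patterns are 64-pixel shapes in 16x16 blocks):

```python
def best_pattern(motion_mask, codebook):
    """Pick the codebook pattern best matching a flattened binary motion
    mask; score = matched 1s minus mismatched positions (illustrative)."""
    def score(pattern):
        inter = sum(m & p for m, p in zip(motion_mask, pattern))
        miss = sum(m ^ p for m, p in zip(motion_mask, pattern))
        return inter - miss
    return max(codebook, key=score)

# Toy 2x2 masks instead of the paper's 16x16 blocks with 64-pixel patterns.
codebook = [[1, 1, 0, 0], [0, 0, 1, 1]]
print(best_pattern([1, 0, 0, 0], codebook))  # [1, 1, 0, 0]
```

VPS and EVPS then restrict this search to a preferentially selected subset of the 32-pattern codebook, which is where the speed-up comes from.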
Data collected for the larger study of Consumer/Doctor decision making surrounding adverse drug reactions and prescribing.  
This paper presents findings from case studies of health consumers who each suspect they may have experienced an adverse drug reaction (ADR). These case studies are part of a larger study involving consumer/doctor decisions surrounding suspected adverse drug reactions and prescribing. Decision support to assist with the diagnosis and management of ADRs has, to date, primarily focused on providing in-time information to prescribers about factors that pertain to the consumer and the medications they are taking. Decision support that includes consumers usually targets treatment decisions. The results of this paper indicate the prescriber is only one decision contributor in a rich tapestry of decision contributors and decision types, and consumer decision types are significantly broader than treatment decisions. The results provide guidance for the development of decision support within this domain. ACM Classification: J.3 (Life and Medical Sciences), K.4 (Computers and Society)
Percentages of reviewed papers by article type.
Percentages of reviewed papers by application field.
In this paper we present a comprehensive study of mobile agent applications. We classify the application fields as follows: Network monitoring and management, information searching and filtering, multimedia, Internet, intrusion detection, telecommunications, military, and others. We discuss the potential uses of mobile agents in the various fields and present the many systems and architectures that have been proposed and implemented. Furthermore, we describe ongoing efforts to integrate currently implemented technologies with mobile agent technology. For each of the application fields, we list statistics showing the distribution of research output according to certain criteria such as article type and application field. We end each section with a summary of the work done and provide directions for future work. Finally, we conclude with suggestions about promising research areas involving mobile agents.
Design of user interaction of web-based agent systems necessitates new approaches in relation to control, task allocation, transparency and user's privacy protection. This paper investigates interaction of users with multiple agents with special focus on web-based learning systems. A proposed new architecture is described, which allows for adaptive agents' participation in the educational process, while maintaining the user as the principal locus of control in user-system interaction. The issue of user modelling, the characteristics of the conceptual model of the user and the implications of the heterogeneity of resources are also discussed in the frame of an open web-based learning environment.
An Overall Long Transaction with Local Transactions and ACID Transactions
E-Move Compensation Diagram
BPMN Compensation Handling and Transactions
This paper describes extensions to a Behavioural Description Language (BDL), which was originally proposed to characterize concurrent behaviour of simple objects and a group of objects. One of the novelties of this paper is its application to the field of E-Commerce transaction systems. Based on the BDL, we propose new concepts, namely, transaction patterns and transaction architectures, which have event-based semantics to describe large-scale transaction systems. Furthermore, the transaction architecture is introduced as a unified medium for specifying and verifying distributed, heterogeneous and complex E-Commerce transaction processes. It is also illustrated as a powerful modeling technique which is easy-to-use, flexible and promotes reusability.
Bit-counting refers to the operation of counting the number of "1"s in a given computer word or binary vector. Several algorithms exist to solve this problem; the simplest is serial shifting. Many algorithms have evolved in recent years to overcome the slowness of serial shifting, but their performance behaviour has not been studied deeply. In this paper, the performance behaviour of the existing algorithms is investigated with clarifying comments. Moreover, an enhanced lookup-table algorithm that is faster than the existing algorithms is presented and evaluated.
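The two endpoints of the design space can be sketched directly (an illustrative sketch, not the paper's enhanced algorithm): serial shifting tests one bit per step, while a lookup table counts a whole byte at a time from a precomputed table.

```python
def popcount_serial(x):
    """Serial shifting: test and shift one bit per iteration."""
    count = 0
    while x:
        count += x & 1
        x >>= 1
    return count

# 256-entry table, built once; a word is then counted one byte at a time.
TABLE = [popcount_serial(i) for i in range(256)]

def popcount_table(x):
    count = 0
    while x:
        count += TABLE[x & 0xFF]
        x >>= 8
    return count

print(popcount_table(0xF0F0F0F0))  # 16
```

The table version does a quarter of the loop iterations of the serial version on 32-bit words, at the cost of 256 bytes of memory; wider table indices trade more memory for still fewer iterations.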
In this paper a new encoding scheme and a software environment, called DAGC, for developing and evaluating genetic clustering algorithms are described. DAGC facilitates experiments with genetic clustering algorithms by providing an extensible library of components for assembling new algorithms or modifying existing ones. The algorithms may be executed within the environment on caterpillar graphs, random graphs or class dependency graphs extracted from a given source code. The resulting clusterings can be stored in a database for later analysis. DAGC allows confidence analysis by automatically deriving a consolidated model from different clustering results for a given graph. We also offer a new clustering algorithm, also called DAGC. The results of comparing the DAGC algorithm with a well-known algorithm, Bunch, are presented.
Atomic Business Models Adapted from (Weill and Vitale, 2002).
High-Level Process for SOARE Approach
GRL Legend Adapted from Liu and Yu (2001) and University of Toronto (2003)
Value Net Integrator Model Goal Pattern
This paper proposes the Strategy-oriented Alignment in Requirements Engineering (SOARE) approach for e-business systems. The primary objective of the SOARE approach is to enable alignment between requirements for e-business systems and the business strategies they are intended to support. The SOARE approach incorporates means for analysing and decomposing business strategy, employing goal modelling both to represent business strategy in a requirements engineering context and to link high-level strategic objectives to low-level requirements through goal refinement. The SOARE approach further describes a basis for deriving and leveraging recurring requirements patterns. This paper proposes a high-level process for the SOARE approach, which is then illustrated via a proof-of-concept case study from the literature.
We present a best-effort resource allocation algorithm called RBA for asynchronous real-time distributed systems. The algorithm uses Jensen's benefit functions for expressing application timeliness requirements and proposes adaptation functions to describe the anticipated application workload during future time intervals. Furthermore, RBA considers an adaptation model where subtasks of application tasks may be replicated at run-time to share workload increases, and a real-time Ethernet system model where message collisions are deterministically resolved. Given such application, adaptation, and system models, the algorithm's objective is to maximise aggregate application benefit and minimise the aggregate missed-deadline ratio. Since determining the optimal allocation is computationally intractable, RBA heuristically computes the number of replicas needed for the subtasks of each task and their processor assignment such that the resulting allocation is as "close" as possible to the optimal allocation. We also experimentally study RBA's performance under different scheduling and routing algorithms. The experimental results reveal that RBA produces higher aggregate benefit and a lower missed-deadline ratio under DASA than when the RED algorithm is used for scheduling and routing.
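Jensen-style benefit functions map a task's completion time to an accrued benefit; a minimal sketch with simple step-shaped functions (the shapes and values here are illustrative, not RBA's) shows how aggregate benefit is computed:

```python
def step_benefit(deadline, value):
    """A Jensen-style time/benefit function with a simple step shape:
    full benefit if the task completes by its deadline, none afterwards."""
    return lambda completion_time: value if completion_time <= deadline else 0

def aggregate_benefit(tasks):
    """tasks: list of (benefit_function, completion_time) pairs."""
    return sum(fn(t) for fn, t in tasks)

b1 = step_benefit(10, 100)   # deadline 10, benefit 100
b2 = step_benefit(5, 50)     # deadline 5, benefit 50
print(aggregate_benefit([(b1, 8), (b2, 7)]))  # 100: the second task missed
```

An allocator like RBA searches over replica counts and processor assignments to maximise this aggregate; richer benefit shapes (ramps, decays) fit the same interface.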
Designing security protocols is a challenging and deceptive exercise. Even small protocols providing straightforward security goals, such as authentication, have been hard to design correctly, leading to the presence of many subtle attacks. Over the years various formal approaches have emerged to analyse security protocols making use of different formalisms. Schneider has developed a formal approach to modelling security protocols using the process algebra CSP (Communicating Sequential Processes). He introduces the notion of rank functions to analyse the protocols. We demonstrate an application of this approach to the Woo-Lam protocol. We describe the protocol in detail along with an established attack on its goals. We then describe Schneider's rank function theorem and use it to analyse the protocol. ACM Classification: C.2.2 (Communication/Networking and Information Technology - Network Protocols - Protocol Verification), D.2.4 (Software Engineering - Software/Program Verification - Formal Methods), D.4.6 (Operating Systems - Security and Privacy Protection - Authentication)
A specific run of the Woo-Lam protocol involving A and B using nonce N_B
A rank function for the Woo-Lam protocol. Recall that the rank function theorem is defined in terms of general sets R and T. For our analysis, we assign R = {Running.A.B.N_B} and T = {Commit.B.A.N_B}.
This paper proposes a new kind of Secure Anonymous Web Transaction (SAWT) system for anonymous browsing and communication on the Web with high security. In the proposed system, normal users can surf or shop online anonymously while malicious accesses to a Web server can be traced and discovered. The latter property has not been achieved in other existing systems, and it brings greater fairness to both users and Web servers.
Process support systems, such as workflows, are being used in a variety of domains. However, most areas of application have focused on traditional production-style processes, which are characterised by predictability and repetitiveness. Application in non-traditional domains with highly flexible processes is still largely unexplored. Such flexible processes are characterised by the inability to predefine them completely and/or an explosive number of alternatives. Accordingly, we define flexibility as the ability of the process to execute on the basis of a partially defined model, where the full specification is made at runtime and may be unique to each instance. In this paper, we present an approach to building workflow models for such processes. We present our approach in the context of a non-traditional domain for workflow deployment, namely degree programs in tertiary institutions. The primary motivation behind our approach is to provide the ability to model flexible processes without introducing non-standard modelling constructs. This ensures that the correctness and verification of the language are preserved. We propose to build workflow schemas from a standard set of modelling constructs and given process constraints. We identify the fundamental requirements for constraint specification and classify them into selection, termination and build constraints. We detail the specification of these constraints in a relational model. Finally, we demonstrate the dynamic building of instance-specific workflow models on the basis of these constraints.
A careful task assignment in a distributed computer system may reduce the workload of a bottleneck computer. It can also decrease the cost of computation if computer sort selection is applied together with the task assignment. Total system performance is another measure that can be optimised by computer sort selection with task assignment. This extended task allocation problem can be formulated as a multiobjective combinatorial optimisation problem, which is solved by an adaptive evolutionary algorithm that finds a subset of Pareto-optimal solutions. Module scheduling is then used to maximise the probability of completing tasks with timing correctness.
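Extracting the Pareto-optimal subset over two objectives, say bottleneck workload and cost, can be sketched as a dominance filter (the tuple layout and values are illustrative, not the paper's encoding):

```python
def pareto_optimal(solutions):
    """Keep the assignments not dominated on both objectives (minimised).
    Each tuple is (bottleneck_workload, cost, label)."""
    def dominates(a, b):
        return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

candidates = [(10, 5, "A"), (8, 7, "B"), (12, 6, "C"), (8, 9, "D")]
print(pareto_optimal(candidates))  # [(10, 5, 'A'), (8, 7, 'B')]
```

The evolutionary algorithm maintains such a non-dominated set across generations; the final choice among Pareto-optimal assignments is then made by the scheduler's timing criterion.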
This paper reports on the "repackaging" and transfer to industry of research results in Viewpoint-Based Requirements Engineering (VBRE). Our experience, as indicated here, is a particular case of the problems and solutions that small research groups can encounter in the transfer from theoretical work to practical application. In particular, starting with a formal approach for discrepancy management, we first developed a "light-weight version" of VBRE and empirically validated its most important claims. Then we developed a software tool (discRman) that encapsulates the main points of the approach and, finally, partner organizations used it in small projects. We think that this "theoretical-empirical-light-weight-tool" sequence can be applied in similar situations, especially by small research groups seeking to transfer their results to industry.
(a) An example of a DTD (b) Tree representation of DTD
RXACL architecture for enforcing XML access control  
In this paper we present query filtering techniques based on bottom-up tree automata for XML access control. In our authorization model (RXACL), RDF statements are used to represent security objects and to express the security policy. Our model allows us to express and enforce access control on XML trees and their associations. We propose a query-filtering technique that evaluates XML queries to detect disclosure of association-level security objects. A query Q discloses a security object o iff the (tree) automaton corresponding to o accepts Q. We show that our schema-level method detects all possible disclosures, i.e., it is complete.
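The acceptance test at the heart of this filtering step is standard bottom-up tree-automaton evaluation: assign a state to each leaf, combine child states at each internal node via a transition table, and check whether the root state is accepting. A minimal sketch, with a hypothetical automaton recognising an association of a patient's name and diagnosis (the node labels, states, and encoding are illustrative, not the paper's actual RXACL objects):

```python
def eval_state(tree, delta):
    """Bottom-up evaluation: a tree is (label, [child trees]);
    delta maps (label, tuple-of-child-states) -> state.
    Returns None when no transition applies (implicit reject)."""
    label, children = tree
    child_states = tuple(eval_state(c, delta) for c in children)
    return delta.get((label, child_states))

def accepts(tree, delta, accepting):
    """True iff the state reached at the root is accepting."""
    return eval_state(tree, delta) in accepting

# hypothetical automaton for the association (name, diagnosis) under patient
delta = {
    ("name", ()): "qn",
    ("diagnosis", ()): "qd",
    ("patient", ("qn", "qd")): "qa",
}
accepting = {"qa"}

full = ("patient", [("name", []), ("diagnosis", [])])
partial = ("patient", [("name", [])])
print(accepts(full, delta, accepting))     # the association is disclosed
print(accepts(partial, delta, accepting))  # name alone is not the association
```

A query would be rejected (or rewritten) exactly when the automaton of some protected security object accepts it.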
Distributed denial of service (DDoS) attacks on the Internet have become an immediate problem. As DDoS streams do not have common characteristics, currently available intrusion detection systems (IDS) cannot detect them accurately. As a result, defending against DDoS attacks using currently available IDSs will dramatically affect legitimate traffic. In this paper, we propose a distributed approach to defend against distributed denial of service attacks by coordinating across the Internet. Unlike traditional IDS, we detect and stop DDoS attacks within the intermediate network. In the proposed approach, DDoS defense systems are deployed in the network to detect DDoS attacks independently. A gossip-based communication mechanism is used to exchange information about network attacks between these independent detection nodes, aggregating information about the overall network attacks observed. Using the aggregated information, the individual defense nodes have approximate information about global network attacks and can stop them more effectively and accurately. To provide reliable, rapid and widespread dissemination of attack information, the system is built as a peer-to-peer overlay network on top of the Internet. ACS Classification: C.2 (Computer-Communication Networks), D.2 (Software Engineering).
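A common gossip primitive for this kind of aggregation is pairwise averaging: each round, two random nodes exchange their current estimates and both keep the mean, so every node's estimate converges to the network-wide average without any central collector. A minimal sketch, assuming each node holds a local count of suspicious flows (the node states and message format here are illustrative, not the paper's actual protocol):

```python
import random

def gossip_average(values, rounds=1000, seed=0):
    """Pairwise averaging gossip over a fully connected overlay.

    values: one local measurement per node (e.g. suspicious-flow counts).
    Each round a random pair of nodes averages their estimates; the sum
    is preserved, so all estimates converge to the global mean."""
    rng = random.Random(seed)
    est = list(values)
    n = len(est)
    for _ in range(rounds):
        i, j = rng.sample(range(n), 2)
        mean = (est[i] + est[j]) / 2
        est[i] = est[j] = mean
    return est

# five detection nodes with local counts of suspicious flows
local = [0, 12, 3, 0, 5]          # global mean is 4.0
print(gossip_average(local))      # every estimate approaches 4.0
```

With the converged estimate, each node can compare the global picture against a threshold and decide locally whether to start filtering, which is what makes the detection robust to attacks that look benign at any single vantage point.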
In existing delegation models, delegation security depends entirely on delegators and security administrators, because the delegation constraint in these models is only a prerequisite condition. This paper proposes an Attribute-Based Delegation Model (ABDM) with an extended delegation constraint consisting of both a delegation attribute expression (DAE) and a delegation prerequisite condition (CR). In ABDM, a delegatee must satisfy the delegation constraint (especially the DAE) when assigned to a delegation role. With this constraint, a delegator can restrict the delegatee candidates more strictly. ABDM relieves delegators and security administrators of security management work in delegation. In ABDM, a delegator is not allowed to temporarily delegate permissions to a person who does not satisfy the delegation constraint. To guarantee its flexibility and security, an extension of ABDM named ABDMX is proposed. In ABDMX, a delegator can delegate some high-level permissions to low-level delegatee candidates temporarily, but not permanently.
This paper presents user interface technology, using a glove-based menuing system and 3D interaction techniques. It is designed to support applications that allow users to construct simple models of outdoor structures. The construction of models is performed using various 3D virtual reality interaction techniques, as well as real-time constructive solid geometry, to allow users to build up shapes with no prior knowledge of the environment. Previous work in virtual environments has tended to focus mostly on selection and manipulation, not on starting from an empty world. We demonstrate our user interface with the Tinmith-Metro application, designed to capture city models and street furniture.
Some important beginnings of what became the Australian IT industry can be found in the way the Basser group went about teaching those from government and industry the rudiments of automatic computation and data processing with 'hands-on' experience. The early contributions of John Bennett and his colleagues enabled many who attended courses, visited and used SILLIAC to examine the potential of IT. Many went on to make decisions for their organisations which would have long-term benefit and impact on the growth of this country as an effective and efficient user of the emerging technology.
This paper examines long term changes in the participation of women in professionally accredited computing degree programs. It reports on the results of three intensive Australia-wide studies of the situation in the mid 1980s, in 1992 and in the late 1990s. The early study painted a detailed and rather depressing picture of women's representation in IT education. It also identified barriers to improvement in the discipline itself, the teaching institutions, and for individuals. The intervening years have seen many attempts to address these barriers in respect of both the attraction to and retention of women in IT courses. The current paper summarises the Australia wide studies and then draws upon recent localised data in contrasting environments and larger scale literature to explore how little things have changed as a result of these intervention programs.
This paper discusses the results of a Critical Success Factors (CSFs) study carried out to determine the key IT management needs of Australian CEOs. In the past, several studies to determine IT management needs have been carried out, but they have been aimed mostly at IT managers, not CEOs. This study fills this gap and, by comparing the CSFs of the CEOs with those of the IT managers, shows the areas of misalignment in the management of IT in Australian enterprises. It is concluded that to achieve ongoing alignment, CEOs and senior executives need to gain a management-level understanding of IT. But perhaps even more important is for IT managers to develop a business-oriented perspective for the success of their enterprise.
Cross-correlation coefficients (CCC) between X and Y1 = S&P 500, Y2 = Dow Jones, Y3 = NASDAQ
This paper presents a computational approach for predicting the Australian stock market index, AORD, using multi-layer feed-forward neural networks trained on the time series data of AORD and various interrelated markets. This effort aims to discover an optimal neural network, or a set of adaptive neural networks, for this prediction purpose, which can exploit or model the various dynamical swings and inter-market influences identified by professional technical analysis and quantitative analysis. Four dimensions of optimality in data selection are considered: the optimal inputs from the target market (AORD) itself, the optimal set of interrelated markets, the optimal inputs from those interrelated markets, and the optimal outputs. Two traditional dimensions of the neural network architecture are also considered: the optimal number of hidden layers, and the optimal number of hidden neurons for each hidden layer. Three important results were obtained: a 6-day cycle was discovered in the Australian stock market; the time signature used as additional input provides useful information; and a minimal neural network using 6 daily returns of AORD and 1 daily return of S&P 500, plus the day of the week, as inputs exhibits up to 80% directional prediction correctness.
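The minimal network described above takes 8 inputs (6 AORD returns, 1 S&P 500 return, and an encoded day of the week) and outputs a value whose sign gives the predicted direction. A forward-pass sketch with an untrained, randomly initialised single hidden layer (the weights, activation choice, and day-of-week encoding here are illustrative assumptions, not the paper's trained model):

```python
import math
import random

def forward(x, W1, b1, W2, b2):
    """One forward pass of a single-hidden-layer feed-forward network:
    tanh hidden units, linear output. The sign of the output is taken
    as the predicted direction of the next AORD daily return."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(W2, hidden)) + b2

rng = random.Random(0)
n_in, n_hid = 8, 4  # 6 AORD returns + 1 S&P 500 return + day of week
W1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
W2 = [rng.uniform(-1, 1) for _ in range(n_hid)]
b2 = 0.0

# hypothetical input: 6 AORD returns, 1 S&P 500 return, day-of-week code
x = [0.004, -0.002, 0.001, 0.0, 0.003, -0.001, 0.005, 2 / 6]
direction = "up" if forward(x, W1, b1, W2, b2) > 0 else "down"
print(direction)
```

In practice the weights would be fitted by backpropagation over the historical return series, and the architecture search described in the abstract amounts to repeating this fit while varying the input set and hidden-layer sizes.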
This study investigates the impact of business-to-consumer electronic commerce on Australian commercial organisations. Five factors which affect the impact of an information system on an organisation were identified from the literature and tested against measures of business performance, customers, information and operating costs. To examine this, a mail survey of 400 Australian businesses was conducted. Business performance and customer increases were found to be most highly associated with electronic commerce. Operating cost reductions were not associated with electronic commerce adoption. ACM Computing Classification System: K.4 Computers and Society, K.6 Management of Computing and Information Systems.
The starting point for this study was the findings from a previous study of Australian women working in IT. Four major themes that resulted from a study of IT professionals in Queensland were used as the framework for a deeper exploration of the current position of Australian women in IT. These four findings were explored through open-ended interviews with a broader range of Australian women working in IT. The findings of this study revealed the influence of socio-cultural factors on gender in the Australian IT profession. This paper also discusses more recent research on gender and IT, particularly the attempts to address the under-theorisation of this research area and the significance of mentoring.

is an Australian technology company founded in May 1999 (just before the crash). In this article, co-founder Lisa Bowman writes about life as a start-up in Australia and what makes her team of fourteen confident about the future.
Example of a segmented route presentation via the Web
An example of a segmented route presentation via the Palm
Whereis compared to a Coral-generated route description: "Start at Parbury Lane. Follow Parbury Lane until you reach the end. Take a right. Follow Lower Fort Street for 30 metres. Turn to the left at George Street. Follow George Street until you reach your destination."
In this paper we tackle the problem of generating natural route descriptions on the basis of input obtained from a commercially available way-finding system. Our framework and architecture incorporate general principles drawn from the domain of natural language generation. Through examples we demonstrate that it is possible to bridge the gap between underlying data representations and natural-sounding linguistic descriptions. The work presented contributes both to the area of natural language generation and to the improvement of way-finding system interfaces.
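At its simplest, the bridge from a way-finder's segment data to prose is template-based surface realisation: map each route segment to a sentence pattern and vary the pattern by position in the route. A minimal sketch in that spirit, using a hypothetical segment format rather than the actual Whereis or Coral data structures:

```python
def describe_route(segments):
    """Render route segments as natural-language instructions.

    segments: list of dicts with keys 'street', and optionally
    'turn' (direction at entry) and 'metres' (segment length).
    The wording varies for the first and last segments."""
    sentences = [f"Start at {segments[0]['street']}."]
    last = len(segments) - 1
    for i, seg in enumerate(segments):
        if i > 0:
            sentences.append(f"Turn {seg['turn']} at {seg['street']}.")
        if i == last:
            sentences.append(
                f"Follow {seg['street']} until you reach your destination.")
        elif seg.get("metres"):
            sentences.append(
                f"Follow {seg['street']} for {seg['metres']} metres.")
        else:
            sentences.append(
                f"Follow {seg['street']} until you reach the end.")
    return " ".join(sentences)

# hypothetical segments matching the Parbury Lane example
route = [
    {"street": "Parbury Lane"},
    {"street": "Lower Fort Street", "turn": "right", "metres": 30},
    {"street": "George Street", "turn": "left"},
]
print(describe_route(route))
```

A full NLG pipeline would add content selection and aggregation on top of this (e.g. merging short segments, choosing landmarks), which is where the general principles cited above come in.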