Advanced Engineering Informatics

Published by Elsevier
Print ISSN: 1474-0346
Publications
Ontologies are now in widespread use as a means of formalizing domain knowledge in a way that makes it accessible, shareable and reusable. Nevertheless, to many, the nature and use of ontologies are unfamiliar. This paper takes a practical approach – through the use of examples – to clarifying what ontologies are and how they might be useful in an important and representative phase of the engineering design process, that of design requirement development and capture. The paper consists of two parts. In the first part, ontologies and their use are discussed, and a methodology for developing ontologies is explored. In the second part, three very different types of ontology are developed in accordance with the methodology. Each of the ontologies captures a different conceptual facet of the engineering design domain, described at a quite different level of abstraction from the others. The process of developing ontologies is illustrated in a practical way, and the application of these ontologies to supporting the capture of engineering design requirements is described as a means of demonstrating the general potential of ontologies.
 
The paper presents parts of the development of a spatial query language for building information models. Such a query language enables the spatial analysis of building information models and the extraction of partial models that fulfill certain spatial constraints. Among other features, it includes topological operators, i.e. operators that reflect the topological relationships between 3D spatial objects. The paper presents definitions of the semantics of the topological operators within, contain, touch, overlap, disjoint and equal in 3D space by using the 9-intersection model. It further describes a possible implementation of the topological operators by means of an octree-based algorithm. The recursive algorithm presented in this article relies on a breadth-first traversal of the operands' octree representations and the application of rules based on the color of the octants under examination. Because it successively increases the discrete resolution of the spatial objects employed, the algorithm enables the user on the one hand to handle topological relationships in a fuzzy manner and, on the other, to trade off computational effort against the required accuracy. The article also presents detailed investigations of the runtime performance of the developed algorithm.
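As an illustration of the colour-based octree rules described above, the following minimal Python sketch (not the paper's implementation, which uses a breadth-first traversal and covers all six operators) tests whether the interiors of two objects intersect, refining grey octants up to a depth bound; the class and names are illustrative choices, and the depth-first recursion is used here only for brevity.

    from dataclasses import dataclass
    from typing import List, Optional

    WHITE, GREY, BLACK = 0, 1, 2   # empty / partially covered / fully covered

    @dataclass
    class Octant:
        color: int
        children: Optional[List["Octant"]] = None   # eight children when GREY

    def interiors_intersect(a: Octant, b: Octant, depth: int) -> bool:
        """Compare octants covering the same spatial cell. Returns True if the
        interiors provably intersect within `depth` refinement steps; beyond
        that the answer stays False, which gives the 'fuzzy' cut-off."""
        if a.color == WHITE or b.color == WHITE:
            return False                  # an empty octant contributes nothing
        if a.color == BLACK and b.color == BLACK:
            return True                   # two fully covered octants overlap
        if depth == 0:
            return False                  # undecided at this resolution
        # A BLACK octant behaves like eight BLACK children when refined.
        ac = a.children if a.color == GREY else [Octant(BLACK)] * 8
        bc = b.children if b.color == GREY else [Octant(BLACK)] * 8
        return any(interiors_intersect(x, y, depth - 1) for x, y in zip(ac, bc))

    # Example: two objects occupying the same single sub-octant.
    blob = Octant(GREY, [Octant(BLACK)] + [Octant(WHITE)] * 7)
    assert interiors_intersect(blob, blob, depth=4)

Raising the depth bound tightens the answer at the cost of more octant comparisons, which is exactly the accuracy/effort trade-off the abstract describes.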
 
The construction industry lacks solutions for accurately, comprehensively and efficiently tracking the three-dimensional (3D) status of buildings under construction. Such information is, however, critical to the successful management of construction projects: it supports fundamental activities such as progress tracking and construction dimensional quality control. In this paper, a new approach for automated recognition of project 3D Computer-Aided Design (CAD) model objects in large laser scans is presented, with significant improvements over the one originally proposed in Bosché et al. (in press) [11]. A more robust point matching method is used, and registration quality is improved through the implementation of an Iterative Closest Point (ICP)-based fine registration step. Once the optimal registration of the project's CAD model with a site scan is obtained, a similar ICP-based registration algorithm is used to calculate the as-built poses of the CAD model objects. These as-built poses are then used for automatically controlling the compliance of the project with respect to the corresponding dimensional tolerances. Experimental results are presented with data obtained from the erection of an industrial building's steel structure. They demonstrate the performance in real field conditions of the model registration and object recognition algorithms, and show the potential of the proposed approach for as-built dimension calculation and control.
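For readers unfamiliar with ICP, the sketch below shows the core of a generic point-to-point variant in Python/NumPy: alternate nearest-neighbour matching with a closed-form (SVD-based) rigid-transform fit. This is a textbook formulation assumed for illustration, not the paper's implementation, and the brute-force matching stands in for the spatial index one would use on real scans.

    import numpy as np

    def fit_rigid(P, Q):
        """Least-squares rotation R and translation t with R@P_i + t ~= Q_i (Kabsch)."""
        cp, cq = P.mean(axis=0), Q.mean(axis=0)
        U, _, Vt = np.linalg.svd((P - cp).T @ (Q - cq))
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
        R = Vt.T @ D @ U.T
        return R, cq - R @ cp

    def icp(source, target, iters=30):
        """Point-to-point ICP: match each source point to its nearest target
        point, fit the best rigid transform, apply it, and repeat."""
        src = source.copy()
        for _ in range(iters):
            d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(-1)
            R, t = fit_rigid(src, target[d2.argmin(axis=1)])
            src = src @ R.T + t
        return src

    # aligned = icp(scan_points, model_points)   # both (N, 3) float arrays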
 
A spatial query language enables the spatial analysis of building information models and the extraction of partial models that fulfill certain spatial constraints. Among other features, the developed spatial query language includes directional operators, i.e., operators that reflect the directional relationships between 3D spatial objects, such as northOf, southOf, eastOf, westOf, above and below. The paper presents in-depth definitions of the semantics of two new directional models for extended 3D objects, the projection-based and the halfspace-based model, using point-set theory notation. It further describes a possible implementation of the directional operators using a newly developed space-partitioning data structure called the slot-tree, which is derived from the objects' octree representation. The slot-tree allows for the application of recursive algorithms that successively increase the discrete resolution of the spatial objects employed, and thereby enables the user to trade off computational effort against the required accuracy. The article also presents detailed investigations of the runtime performance of the developed algorithms.
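The contrast between the two directional models can be shown with a toy example. Using axis-aligned bounding boxes in place of the paper's slot-tree machinery (an illustrative simplification, not the paper's method), the halfspace-based "above" only requires the second object to lie beyond the first object's top plane, while the projection-based variant additionally requires overlapping xy footprints:

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class Box:                    # axis-aligned box via (x, y, z) corner points
        lo: Tuple[float, float, float]
        hi: Tuple[float, float, float]

    def above_halfspace(a: Box, b: Box) -> bool:
        """Halfspace-based model: b lies entirely above the plane z = a.hi.z."""
        return b.lo[2] >= a.hi[2]

    def above_projection(a: Box, b: Box) -> bool:
        """Projection-based model: as above, plus overlapping xy projections."""
        overlap_xy = all(b.lo[i] < a.hi[i] and a.lo[i] < b.hi[i] for i in (0, 1))
        return above_halfspace(a, b) and overlap_xy

    slab = Box((0, 0, 0), (10, 10, 1))
    lamp = Box((20, 20, 5), (21, 21, 6))     # higher up, but far to the side
    assert above_halfspace(slab, lamp) and not above_projection(slab, lamp)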
 
Reuse of existing designs can help companies save a significant amount of time and money in large-scale product development. Finding similar CAD models of products is key to facilitating design reuse. This paper proposes a method for retrieving CAD models of mechanical parts from freehand sketches. Users draw three 2D outline sketches and a skeleton sketch to express their query intent. The 2D outline sketches describe the geometrical information of the models, while the skeleton sketch captures the topological information. A relevance feedback mechanism is introduced to combine the two similarity degrees, measured on the 2D outlines and the skeleton, into a unified one. Experiments are conducted to evaluate the performance of the proposed method.
 
The increase in the use of 4D management tools within the construction industry in recent years has been phenomenal, partly due to the increasing support available in commercial software packages, and partly in response to a greater demand for efficient construction management. However, successfully implementing a 4D management tool in an engineering firm for use on actual projects remains a challenging task. This paper presents the authors' experiences of implementing an in-house 4D management tool at a large engineering, procurement and construction (EPC) firm with a long history of design–build projects. A three-stage consulting framework of system evaluation, usability study, and management plan (SUM) was proposed and implemented for the firm: (1) system evaluation: requirement analysis and performance evaluation of both hardware and software components of the 4D tool; (2) usability study: usability tests and improvement of the 4D tool; and (3) management plan: workflow re-engineering to enable the firm to successfully implement and apply the 4D management tool to actual projects. We found that the SUM framework effectively identified major problems when introducing a 4D tool to a large design–build project and helped to minimize the disruption to the firm's business processes.
 
Four-dimensional models, which are 3D models with an added dimension representing schedule information, have become an important tool for representing construction processes. These models usually rely on colors to represent the different construction states, so that when an ideal color scheme is used, engineers can understand the model and identify potential problems more easily. However, little research has been conducted in this area to date. This paper presents the selection, examination, and user test (SEUT) procedure, a systematic procedure for determining the ideal color scheme for a 4D model. The procedure can be performed iteratively to obtain an ideal color scheme that fits a 4D model's construction purposes. Following the proposed procedure in an example case, we determined an ideal color scheme for six construction states of a 4D model for plant construction. In total, ten color schemes were examined and testing was conducted by 58 users over two iterations. The results show that the SEUT procedure is an effective method for determining color schemes for presenting 4D models, and an ideal color scheme was validated and recommended in this research.
 
The composition of heterogeneous web services is a key aspect of the usability and applicability of web services in application domains such as business applications, healthcare, and e-government. Current research has developed different techniques to achieve effective composition of web services; unfortunately, they fail to ensure a perfect match in the composition of web services. This paper investigates the composition of web services and how to employ web services effectively in design activities. The objectives of this work are twofold: first, to propose a new technique that assists users in resolving mismatches in the composition of web services; second, to implement, validate, and evaluate the proposed technique within the context of design activities, thus establishing a workbench called Service Oriented Design Activities (SODA). SODA provides a web-based design infrastructure that allows loosely coupled design teams to collaborate on different services and enables them to resolve any mismatch between heterogeneous design services. Other anticipated advantages include interoperability of design services, improved designer capabilities, and reduced product development time.
 
This paper presents a method for securing collaborative design using multi-level modeling. When a team of designers works collaboratively on a 3D assembly model, a component of the assembly is presented in full detail to those who have full access privileges to the component, but at an abstract level of detail to those with lower access privileges. These various levels of detail are created in two phases: volumetric feature removal, achieved through interactive feature recognition on the CAD model, and multi-resolution mesh construction, based on polygonal simplification. Appropriate representations of the assembly are extracted according to an access matrix and then presented to the users participating in the collaborative design. The key issues in developing the secure collaborative design system are discussed, and implementation results are reported.
 
This paper presents a method to determine whether a usable wheelchair-accessible route exists in a facility, using motion-planning techniques. We use a 'performance-based' approach to predict the performance of a facility design against the requirements of a building code. This approach has advantages over the traditional 'prescriptive' code-based approach for assessing the acceptability of designs, which is normal practice today for assessing wheelchair accessibility. The prescriptive method can be ambiguous, contradictory, complex, and unduly restrictive in practice, and it can be ad hoc and difficult to implement as a computer application. The performance-based approach directly models the actual possible behaviors of an artifact (in this case, wheelchair motion) that are related to the functional intent of the designed system (a building) and (hopefully) to the specification of a prescriptive building code. The paper presents example cases from architectural practice to illustrate the use of robot motion-planning techniques for wheelchair accessibility analysis. This application is an example of using modern computational methods in support of knowledge-intensive engineering, and the simulation method has broad applicability within engineering design. We illustrate and discuss how to analyze virtual simulations of the detailed behavior of a designed artifact in order to assess its use by the intended users.
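One simple way to operationalize such a performance-based check, sketched here under assumptions not taken from the paper (a 2-D occupancy grid and a Chebyshev clearance radius standing in for wheelchair geometry and full motion planning), is a breadth-first search that only traverses cells with sufficient free space around them:

    from collections import deque

    def accessible_route_exists(grid, start, goal, clearance=1):
        """grid[r][c] is truthy when blocked. A cell is passable only if every
        cell within `clearance` steps (Chebyshev) is inside the grid and free,
        a crude stand-in for wheelchair width and turning room."""
        rows, cols = len(grid), len(grid[0])

        def passable(r, c):
            return all(0 <= r + dr < rows and 0 <= c + dc < cols
                       and not grid[r + dr][c + dc]
                       for dr in range(-clearance, clearance + 1)
                       for dc in range(-clearance, clearance + 1))

        frontier, seen = deque([start]), {start}
        while frontier:
            r, c = frontier.popleft()
            if (r, c) == goal:
                return True
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nxt not in seen and passable(*nxt):
                    seen.add(nxt)
                    frontier.append(nxt)
        return False

A code requirement such as a minimum corridor width then maps onto the clearance parameter, which is the sense in which the simulation checks performance rather than prescriptive dimensions.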
 
With the increasing interest in and emphasis on customer demands in new product development, customer requirements elicitation (CRE) and evaluation have secured a crucial position in the early stage of product conceptualization. It is therefore highly desirable to have a systematic strategy for soliciting customer requirements effectively and, subsequently, analyzing customer orientations quantitatively. For this purpose, a customer-oriented approach is proposed and investigated in this study. It addresses both the breadth and the depth of customer requirements acquisition, as well as customer and marketing analysis. The prototype system comprises two interrelated components, namely the CRE and the customer/marketing analysis (CMA) modules. The process starts from the voice of customers and ends with the opportunities identified through marketing analysis. In the prototype system, the laddering technique is employed to enable CRE via the so-called customer attributes hierarchy (CAM). In addition, the adaptive resonance theory (ART) neural network, specifically ART2, is applied as a toolkit for further customer and marketing analysis. A case study on wood golf club design demonstrates and validates the approach.
 
Architecture, engineering, and construction (AEC) projects are characterized by a large variation in requirements and work routines, which makes it difficult to develop and implement information systems to support them. To address these challenges, this paper presents a project-centric research and development methodology that combines ethnographic observation of practitioners working in local project organizations, to understand their local requirements, with iterative improvement of information systems directly on projects in small action research implementation cycles. The paper shows the practical feasibility of the theoretical methodology using cases from AEC projects in North America and Europe. The cases provide evidence that ethnographic–action research is well suited to supporting the development and implementation of information systems. In particular, the paper shows that the method enabled researchers on the cases to identify specific problems on AEC projects and helped them to adapt information systems accordingly, in close collaboration with the practitioners working on these projects.
 
A multi-layer perceptron network is made adaptive by updating its weights using the extended Kalman filter (EKF). When the network is used as a model of a non-linear plant, the model can be adapted on-line with input/output data to capture the system's time-varying dynamics and consequently be used in adaptive control. The paper describes how the EKF algorithm is used to update the network model and gives the implementation procedure. The developed adaptive model is evaluated for on-line modelling and model-inversion control of a simulated continuous stirred-tank reactor. The modelling and control results show the effectiveness of model adaptation under system disturbances and of global tracking control.
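The core of EKF-based weight updating can be stated compactly: the weight vector acts as the filter state and the network output as the measurement function. The sketch below is a generic formulation assumed for illustration (not the paper's implementation); the Jacobian is approximated by finite differences, and Q and R are the process- and measurement-noise covariances.

    import numpy as np

    def ekf_weight_update(w, P, x, y, h, Q, R, eps=1e-6):
        """One EKF step on network weights w for a scalar training pair (x, y).
        h(w, x) -> scalar network output; P is the weight covariance."""
        n = len(w)
        y_hat = h(w, x)
        # Finite-difference Jacobian of the output w.r.t. the weights (1 x n).
        H = np.array([[(h(w + eps * e, x) - y_hat) / eps for e in np.eye(n)]])
        S = H @ P @ H.T + R                    # innovation covariance (1 x 1)
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain (n x 1)
        w = w + (K * (y - y_hat)).ravel()      # state (weight) update
        P = (np.eye(n) - K @ H) @ P + Q        # covariance update
        return w, P

    # Example measurement function: a 2-input, 1-hidden-unit network.
    h = lambda w, x: float(w[2] * np.tanh(w[0] * x[0] + w[1] * x[1]) + w[3])
    w, P = np.zeros(4), np.eye(4)
    w, P = ekf_weight_update(w, P, x=np.array([0.5, -1.0]), y=0.3,
                             h=h, Q=1e-4 * np.eye(4), R=np.array([[0.01]]))

Running this step once per plant input/output sample is what makes the model track time-varying dynamics on-line.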
 
The paper proposes an adaptive web system—that is, a website that is capable of changing its original design to fit user requirements. To remedy shortcomings of a website, and to make it much easier for users to access information, the system analyzes user browsing patterns from access records. This paper concentrates on the operating efficiency of a website—that is, the efficiency with which a group of users browses it. With high efficiency, users incur lower operating cost to accomplish a desired goal. Based on user access data, we analyze each user's operating activities as well as their browsing sequences, and from this data we calculate a measure of the efficiency of the user's browsing sequences. The paper develops an algorithm to calculate this efficiency accurately and to suggest how to increase the efficiency of user operations. This can be achieved in two ways: (i) by adding a new link between two web pages, or (ii) by suggesting that designers reconsider existing inefficient links so as to allow users to arrive at their target pages more quickly. Using this algorithm, we develop a prototype to prove the concept of efficiency. The implementation is an adaptive website system that automatically changes the website architecture according to user browsing activities and improves website usability from the viewpoint of efficiency.
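One plausible formalization of such an efficiency measure (an assumption for illustration; the paper's exact definition may differ) is the ratio between the minimum number of link traversals needed to reach a session's target page and the number the user actually performed; sessions with low ratios point at candidate shortcut links:

    from collections import deque

    def shortest_hops(links, src, dst):
        """Fewest link traversals from src to dst over the site graph
        (adjacency dict: page -> iterable of linked pages), or None."""
        seen, frontier = {src: 0}, deque([src])
        while frontier:
            u = frontier.popleft()
            if u == dst:
                return seen[u]
            for v in links.get(u, ()):
                if v not in seen:
                    seen[v] = seen[u] + 1
                    frontier.append(v)
        return None

    def session_efficiency(links, session):
        """minimal hops / actual hops for one browsing sequence; 1.0 means the
        user took an optimal route, low values suggest adding a direct link."""
        actual = len(session) - 1
        minimal = shortest_hops(links, session[0], session[-1])
        return None if not actual or minimal is None else minimal / actual

    site = {"home": ["news", "products"], "products": ["golf"], "news": ["home"]}
    print(session_efficiency(site, ["home", "news", "home", "products", "golf"]))  # 0.5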
 
In this paper, a novel immunized reinforcement adaptive learning mechanism, employing behavior-based knowledge and the on-line adaptive capabilities of the immune system, is proposed and applied to an intelligent mobile robot. Rather than building a detailed mathematical model of immune systems, we explore principles of the immune system, focusing on its self-organization, adaptive capability and immune memory. Two levels of the immune system are investigated: the underlying 'micro' level of cell interactions, and the emergent 'macro' level of the behavior of the system. To evaluate the proposed immunized architecture, a 'food foraging work' simulation environment containing a mobile robot and food items, with and without obstacles, is created to simulate the real world. The simulation results validate several significant characteristics of the immunized architecture: adaptability, learning, self-organization, and stable approach to an ecological niche.
 
This paper deals with quality of service, defined at the application level, with respect to business constraints expressed in terms of business processes. We present a set of adaptive methods and rules for routing messages in an integration infrastructure that yields a form of autonomic behavior, namely the ability to dynamically optimize the flow of messages in order to comply with SLAs according to business priorities. EAI (Enterprise Application Integration) infrastructures may be seen as component systems that exchange asynchronous messages over an application bus, under the supervision of a process-flow engine that orchestrates the messages. The QoS (Quality of Service) of the global IT system is defined and monitored with SLAs (Service Level Agreements) that apply to each business process. The goal of this paper is to propose routing strategies for message handling that maximize the ability of the EAI system to meet these requirements in a self-adaptive and self-healing manner, i.e., its ability to cope with sudden variations of the event flow or temporary failures of a component system. These results are a first contribution towards the deployment of autonomic computing concepts in BPM (Business Process Management) architectures. This approach marks a departure from previous approaches in which QoS constraints are pushed to the lower level (e.g., the network). Although the techniques, such as adaptive queuing, are similar, managing QoS at the business process level yields more flexibility and robustness.
 
One of the primary motivations behind autonomic computing (AC) is the problem of administrating highly complex systems. AC seeks to solve this problem through increased automation, relieving system administrators of many burdensome activities. However, the AC strategy of managing complexity through automation runs the risk of making management harder, if not designed properly. Field studies of current system administrator work practices were performed to inform the design of AC systems from the system administrator's perspective, particularly regarding four important activities: collaboration and coordination, rehearsal and planning, maintaining situation awareness, and managing multitasking, interruptions, and diversions. Based on these studies, guidelines for designing usable AC systems that support these activities effectively are provided.
 
A peer-to-peer (p2p) based service flow management system, SwinDeW-S, can support decentralised Web service composition, deployment and enactment. However, traditional workflow definition languages, such as extended XPDL and service-oriented BPEL4WS, have become insufficient for specifying business process semantics, especially the descriptions of inputs, outputs, preconditions and effects. In this paper, we propose a novel solution based on OWL-S, a semantic Web ontology language that supports service discovery, invocation and negotiation more effectively. The enhanced SwinDeW-S architecture is adapted with advanced ontology-based service profiles and takes advantage of a well-developed profile generation tool, which translates BPEL4WS process models into OWL-S profiles. As a result, in a new prototype equipped with both BPEL4WS and OWL-S, communication and coordination among service flow peers have become better organised and more efficient.
 
This article describes research conducted to gather empirical evidence on size, character and content of the option space in building design projects. This option space is the key starting point for the work of any climate engineer using building performance simulation who is supporting the design process. The underlying goal is to strengthen the role of advanced computing in building design, especially in the early conceptual stage, through a better integration of building performance simulation tools augmented with uncertainty analysis and sensitivity analysis. Better integration will need to assist design rather than automate design, allowing a spontaneous, creative and flexible process that acknowledges the expertise of the design team members. This research investigates and contrasts emergent option spaces and their inherent uncertainties in an artificial setting (student design studios) and in real-life scenarios (commercial design project case studies). The findings provide empirical evidence of the high variability of the option space that can be subjected to uncertainty analysis and sensitivity analysis.
 
An implicit human–machine interaction framework that is sensitive to human anxiety is presented. The overall goal is to achieve detection and recognition of anxiety based on physiological signals. This involves building an anxiety-recognition system capable of interpreting the information contained in physiological processes to predict the probable anxiety state. Since anxiety plays an important role in various human–machine interaction tasks and can be related to task performance, the presented anxiety-recognition methods can be potentially applied to the design of advanced machines and engineering systems capable of intelligent decision-making while interacting with humans. Regression tree and fuzzy logic methodologies have been investigated for the above task. This paper presents the results of applying these two methods and discusses their comparative merits. Three human participant experiments were designed and trials were conducted with five participants. The experimental results demonstrate the feasibility of the proposed anxiety-recognition methods. To the best of our knowledge, our work is the first consolidated effort at fusing multiple physiological indices for robust, real-time detection of anxiety using pattern recognition methods such as fuzzy logic and regression trees.
 
This paper presents a generalized agent-based framework that uses negotiation to dynamically and optimally schedule events. Events can be created dynamically by any active agent in the environment. Each event may potentially require collaboration or resources from one or more other agents. The allocation of resources to the event will be negotiated iteratively until a compromise is found. The framework consists of a user preference model, an evaluation or utility function, and a negotiation protocol. The negotiation protocol is used to implement a distributed negotiation algorithm, called Nstar (N∗). N∗ is based conceptually on the A∗ algorithm for optimal search but extended for distributed negotiation. N∗ is a general distributed negotiation algorithm that makes use of negotiation strategies to find solutions that meet different negotiation objectives. For example, it can use a utility optimizing strategy to find the solution that maximizes average utilities of individual agents. Alternatively, it can select a time optimizing strategy to locate a ‘quick’ feasible solution. The negotiation protocol also performs conflict resolution using a form of iterative repair that renegotiates events which have conflicts. A special case of this framework was used in our MAFOA (mobile agents for office automation) environment to perform agent-based meeting scheduling. A computer simulation test bed was built to simulate the scheduling of hundreds of randomly generated meetings using our N∗ algorithm.
 
In the numerous existing studies of multi-agent robot systems, such systems are positioned at the crossover of robotics and distributed autonomous systems. Multi-agent robots perform many tasks, which are classified here into six types according to the dimension of the goal state and the number of iterations of the tasks. This paper surveys earlier studies on multi-agent robots for each type, such as multi-robot motion-planning algorithms and exploration algorithms for multiple robots. The tasks that multi-agent robots can perform are becoming increasingly complex as they move from single, one-time tasks to those involving many iterations. This study investigates current trends and the potential of multi-agent robot systems.
 
The ability of agents to learn is of growing importance in multi-agent systems and is considered essential for improving the quality of peer-to-peer negotiation in these systems. This paper reviews various aspects of agent learning and presents the particular learning approach—Bayesian learning—adopted in the MASCOT system (multi-agent system for construction claims negotiation). The core objective of the MASCOT system is to facilitate construction claims negotiation among different project participants, and agent learning is an integral part of its negotiation mechanism. The paper demonstrates that the ability to learn greatly enhances agents' negotiation power and speeds up the rate of convergence between agents. In this case, learning is essential for the success of peer-to-peer agent negotiation systems.
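The Bayesian core of such learning is a posterior update over hypotheses about the opponent. The sketch below illustrates it for candidate reservation prices with a toy likelihood model; the hypothesis set, the likelihood function and the numbers are illustrative assumptions, not MASCOT's actual model.

    import math

    def bayes_update(prior, likelihood, observation):
        """posterior(h) proportional to prior(h) * P(observation | h)."""
        post = {h: p * likelihood(observation, h) for h, p in prior.items()}
        z = sum(post.values())
        return {h: p / z for h, p in post.items()}

    def offer_likelihood(offer, reserve, sigma=5.0):
        # Toy model: opponents tend to offer around 90% of their reservation price.
        return math.exp(-(offer - 0.9 * reserve) ** 2 / (2 * sigma ** 2))

    prior = {80: 1 / 3, 100: 1 / 3, 120: 1 / 3}   # candidate reservation prices
    posterior = bayes_update(prior, offer_likelihood, observation=92)

Each observed offer sharpens the estimate of the opponent's reserve, and a better estimate lets an agent concede just enough, which is one mechanism by which learning speeds up convergence.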
 
Agent technology has been recognized as a promising paradigm for next-generation manufacturing systems. Researchers have attempted to apply agent technology to manufacturing enterprise integration, enterprise collaboration (including supply chain management and virtual enterprises), manufacturing process planning and scheduling, shop floor control, and holonic manufacturing as an implementation methodology. This paper provides an updated review of recent achievements in these areas and discusses key issues in implementing agent-based manufacturing systems, such as agent encapsulation, agent organization, agent coordination and negotiation, system dynamics, learning, optimization, security and privacy, and tools and standards.
 
In today's competitive market, production costs, lead time and optimal machine utilization are crucial values for companies. Since machine or process breakdowns severely limit their effectiveness, methods are needed to predict products' life expectancy. Furthermore, continuous assessment and prediction of a product's performance could also enable collaborative product life-cycle management, in which products are followed, assessed and improved throughout their life-cycle. Finally, information about the remaining life of products and their components is crucial for their disassembly and reuse, which in turn leads to more efficient and environmentally friendly use of products and resources. The development of the Watchdog Agent™ answers these needs by enabling multi-sensor assessment and prediction of the performance of products and machines.
 
Agents in a multi-agent system (MAS) can interact and cooperate in many different ways. The topology of agent interaction determines how the agents control and communicate with each other, what control and communication capabilities each agent and the whole system have, and how efficient the control and communications are. In consequence, the topology affects the agents' ability to share knowledge, integrate knowledge, and make efficient use of knowledge in a MAS. This paper presents an overview of four major MAS topological models and assesses their advantages and disadvantages in terms of agent autonomy, adaptation, scalability, and efficiency of cooperation. Insights into the applicability of each topology to different environments and domain-specific applications are explored. A design example applying the topological models to an information service management application illustrates the practical merits of each topology.
 
Intelligent agents provide a means to integrate various manufacturing software applications. The agents are typically executed in a computer-based collaborative environment, referred to as a multi-agent system. The National Institute of Standards and Technology (NIST) has developed a prototype multi-agent system supporting the integration of manufacturing planning, predictive machining models, and manufacturing control. The agents within this platform have access to a knowledge base, a manufacturing resource database, a numerical control programming system, a mathematical equation solving system, and a computer-aided design system. Intelligence has been implemented within the agents in rules that are used for process planning, service selection, and job execution. The primary purposes for developing such a platform are to support the integration of predictive models, process planning, and shop floor machining activities and to demonstrate an integration framework to enable the use of machining process knowledge with higher-level manufacturing applications.
 
At any given step in the architectural design process, a designer can usually only consider a small subset of the actions that can be applied to a design, along with the consequences of those actions for the overall design process. Computer-based design tools can enable humans to operate more efficiently in this process. In the end, the design product (i.e., a built environment) is meant to be used by people other than the designer. Taking the users' perspective into account while creating a layout is crucial to creating an environment that not only fulfills all design constraints but is also usable in everyday life. We present CoSyCAD, a program that can be used to assist architects in the layout of a floor plan; it simultaneously analyzes the cognitive complexity of routes through an indoor environment, thereby enabling direct feedback on a layout's usability. We provide a sample scenario that utilizes the program and discuss possible further enhancements.
 
This paper deals with the design and implementation of a radio-frequency identification (RFID) based cargo monitoring system that supports tracking and tracing in air-cargo operations. In order to select a suitable RFID technology, we first studied the RF operational environment and tested different RFID frequencies. After identifying the right technology (i.e., frequency), we designed and implemented a tracking and tracing system based on EPC networks. We believe that our research will provide a guideline for developing RFID-based tracking systems for cargo operations.
 
Based on an analysis of semantic transmission scenarios in aircraft tooling design, and combining software agent, PDM and CAD technologies, a semantics-based approach to collaborative aircraft tooling design is proposed. This approach combines heuristic reasoning with semantic web ontology to facilitate knowledge representation and reuse, and to solve challenging issues such as semantic transmission between aircraft components and the related tooling, between aircraft tooling and inventories, and among users participating in a new aircraft product development process. A pilot software system has been developed and partially deployed in a large-scale aircraft manufacturing enterprise.
 
Current reactive and standalone network security products are not capable of withstanding the onslaught of diversified network threats. As a result, a new security paradigm, in which integrated security devices or systems collaborate closely to achieve enhanced protection and provide multi-layer defenses, is emerging. In this paper, we present the design of a collaborative architecture in which multiple intrusion detection systems work together to detect real-time network intrusions. The detection is made more efficient and effective by using collaborative intelligent agents, a relevant knowledge base, and a combination of multiple detection sensors. The architecture is composed of three parts: Collaborative Alert Aggregation, Knowledge-based Alert Evaluation, and Alert Correlation. It aims to reduce alert overload by correlating results from multiple sensors to generate condensed views, to reduce false positives by integrating network and host system information into the evaluation process, and to correlate events based on logical relations in order to generate a global and synthesized alert report. The architecture is designed as a layer above intrusion detection, for post-detection alert analysis and security actions. The first two parts of the architecture have been implemented, and the implementation results are presented in this paper.
 
This paper addresses multi-objective optimization from the viewpoint of real-world engineering designs with large numbers of specifications, where robust and global optimization techniques need to be applied. The problem used to illustrate the process is the design of non-linear control systems with hundreds of performance specifications. The performance achieved with a recent multi-objective evolutionary algorithm (MOEA) is compared with that of a proposed scheme for building a robust fitness-function aggregation. The proposed strategy considers performance in the worst situations: the worst-case combination evolution strategy (WCES), which is shown to be robust against the dimensionality of the specifications. A representative MOEA, SPEA-2, achieved satisfactory performance with a moderate number of specifications, but required an exponential increase in population size as more specifications were added, which becomes impractical beyond several hundred. WCES scales well with problem size, since it exploits the similar behaviour of magnitudes evaluated under different situations and searches for similar trade-offs for correlated objectives. Both approaches have been thoroughly characterized across increasing levels of complexity, different design problems, and algorithm configurations.
 
This paper presents the application of path planning in construction sites according to multiple objectives. It quantitatively evaluates the performance of three optimisation algorithms, namely Dijkstra's algorithm, A∗, and genetic algorithms, used to find multi-criteria paths in construction sites based on transportation- and safety-related cost. During a construction project, site planners need to select paths for site operatives and vehicles that are characterised by short distance, low risk and high visibility. These path evaluation criteria are combined using a multi-objective approach and can be optimised to present site planners with the shortest path, the safest path, the most visible path, or a path that reflects a combination of short distance, low risk and high visibility. The accuracy of the path solutions and the time complexities of the optimisation algorithms are compared and critically analysed.
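A common way to combine such criteria, shown here as an illustrative assumption rather than the paper's exact formulation, is a weighted-sum scalarization inside an ordinary Dijkstra search; choosing the weight vector to favour one criterion yields the shortest, safest, or most visible path respectively:

    import heapq

    def best_path(graph, start, goal, w=(1.0, 1.0, 1.0)):
        """Dijkstra on a graph whose edges carry (distance, risk, invisibility)
        costs, scalarized as the dot product with w.
        graph: node -> {neighbour: (d, r, i)} with non-negative costs."""
        pq, done = [(0.0, start, [start])], set()
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == goal:
                return cost, path
            if node in done:
                continue
            done.add(node)
            for nxt, c in graph.get(node, {}).items():
                if nxt not in done:
                    step = sum(wi * ci for wi, ci in zip(w, c))
                    heapq.heappush(pq, (cost + step, nxt, path + [nxt]))
        return None

    # w=(1,0,0): shortest path; w=(0,1,0): safest; w=(0,0,1): most visible.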
 
Evolutionary algorithms (EAs) are stochastic search methods that mimic the natural biological evolution and/or the social behavior of species. Such algorithms have been developed to arrive at near-optimum solutions to large-scale optimization problems, for which traditional mathematical techniques may fail. This paper compares the formulation and results of five recent evolutionary-based algorithms: genetic algorithms, memetic algorithms, particle swarm, ant-colony systems, and shuffled frog leaping. A brief description of each algorithm is presented along with a pseudocode to facilitate the implementation and use of such algorithms by researchers and practitioners. Benchmark comparisons among the algorithms are presented for both continuous and discrete optimization problems, in terms of processing time, convergence speed, and quality of the results. Based on this comparative analysis, the performance of EAs is discussed along with some guidelines for determining the best operators for each algorithm. The study presents sophisticated ideas in a simplified form that should be beneficial to both practitioners and researchers involved in solving optimization problems.
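To give a flavour of the pseudocode-level descriptions the paper provides, here is a minimal global-best particle swarm, one of the five algorithms compared (the hyperparameters are typical defaults assumed here, not the paper's settings):

    import random

    def pso(f, bounds, n=30, iters=200, w=0.7, c1=1.5, c2=1.5):
        """Minimize f over box bounds with a basic global-best particle swarm."""
        dim = len(bounds)
        X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
        V = [[0.0] * dim for _ in range(n)]
        P, Pf = [x[:] for x in X], [f(x) for x in X]      # personal bests
        g = min(range(n), key=lambda i: Pf[i])
        gbest, gf = P[g][:], Pf[g]                        # global best
        for _ in range(iters):
            for i in range(n):
                for d in range(dim):
                    V[i][d] = (w * V[i][d]
                               + c1 * random.random() * (P[i][d] - X[i][d])
                               + c2 * random.random() * (gbest[d] - X[i][d]))
                    X[i][d] = min(max(X[i][d] + V[i][d], bounds[d][0]),
                                  bounds[d][1])
                fx = f(X[i])
                if fx < Pf[i]:
                    P[i], Pf[i] = X[i][:], fx
                    if fx < gf:
                        gbest, gf = X[i][:], fx
        return gbest, gf

    print(pso(lambda x: sum(v * v for v in x), [(-5, 5)] * 3))  # near ([0,0,0], 0)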
 
Simulation-based optimization can assist green building design by overcoming the drawbacks of trial-and-error with simulation alone. This paper presents an object-oriented framework that addresses many particular characteristics of green building design optimization problems such as hierarchical variables and the coupling with simulation programs. The framework facilitates the reuse of code and can be easily adapted to solve other similar optimization problems. Variable types supported include continuous variables, discrete variables, and structured variables, which act as switches to control a number of sub-level variables. The framework implements genetic algorithms to solve (1) unconstrained and constrained single objective optimization problems, and (2) unconstrained multi-objective optimization problems. The application of this framework is demonstrated through a case study which uses a multi-objective genetic algorithm to explore the trade-off relationship between life-cycle cost and life-cycle environmental impacts for a green building design.
 
A virus coevolutionary partheno-genetic algorithm (VEPGA), which combined a partheno-genetic algorithm (PGA) with virus evolutionary theory, is proposed to place sensors optimally on a large space structure for the purpose of modal identification. The VEPGA is composed of a host population of candidate solutions and a virus population of substrings of host individuals. The traditional crossover and mutation operators in genetic algorithm are repealed and their functions are implemented by particular partheno-genetic operators which are suitable to combinatorial optimization problems. Three different optimal sensor placement performance index, one aim on the maximization of linear independence, one aim on the maximization of modal energy and the last is a combination of the front two indices, have been investigated. The algorithm is applied to two examples: sensor placement for a portal frame and a concrete arc dam. Results show that the proposed VEPGA outperforms the sequential reduction procedure (SRP) and PGA. The combined performance index makes an excellent compromise between the linear independence aimed index and the modal energy aimed index.
 
In this paper, we propose a dominance-based selection scheme to incorporate constraints into the fitness function of a genetic algorithm used for global optimization. The approach does not require the use of a penalty function and, unlike traditional evolutionary multiobjective optimization techniques, it does not require niching (or any other similar approach) to maintain diversity in the population. We validated the algorithm using several test functions taken from the specialized literature on evolutionary optimization. The results obtained indicate that the approach is a viable alternative to the traditional penalty function, mainly in engineering optimization problems.
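The flavour of dominance-based constraint handling can be captured in a few lines. The sketch below implements the widely used feasibility-first tournament comparison, in the spirit of (though not necessarily identical to) the paper's selection scheme; it assumes minimization and a precomputed total constraint violation per individual:

    def dominates(f_a, f_b):
        """Pareto dominance between objective vectors (minimization)."""
        return (all(x <= y for x, y in zip(f_a, f_b))
                and any(x < y for x, y in zip(f_a, f_b)))

    def tournament(ind_a, ind_b):
        """Each individual is (objectives, total_violation). A feasible
        individual beats an infeasible one; two infeasibles compare by
        violation; two feasibles by dominance, ties broken arbitrarily."""
        (fa, va), (fb, vb) = ind_a, ind_b
        if (va > 0) != (vb > 0):
            return ind_a if va == 0 else ind_b
        if va > 0:
            return ind_a if va <= vb else ind_b
        if dominates(fb, fa):
            return ind_b
        return ind_a

Because infeasible solutions lose tournaments by violation rather than by a penalized fitness, no penalty coefficients need to be tuned, which is the practical appeal the abstract highlights.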
 
Future 'on-demand' computing systems, often depicted as potentially large-scale and complex Service-Oriented Architectures, will need innovative management approaches for controlling and matching service demand and supply. Centralized optimization approaches reach their limits with increasing network size and number of nodes. The search for decentralized approaches has led to building on self-organization concepts such as Autonomic Computing, which draw their inspiration from biology. This article shows how an alternative self-organization concept from economics, the Catallaxy concept of F.A. von Hayek, can be realized for allocating service supply and demand in a distributed 'on-demand' web services network. Its implementation in a network simulator allows the approach to be evaluated against a centralized resource broker by dynamically varying connection reliability and node density in the network. Exhibiting Autonomic Computing properties, the Catallaxy realization outperforms a centralized broker in highly dynamic environments.
 
In a competitive and globalized business environment, the need for green products is becoming stronger. To meet this trend, environmental impact assessment, besides the delivery, cost and quality of products, should be considered an important factor in the new product development stage. In this paper, a knowledge-based approximate life cycle assessment system (KALCAS) is developed to assess the environmental impacts of product design alternatives. It aims at improving the environmental efficiency of a product using artificial neural networks trained on high-level product attributes and LCA results. The overall framework of a collaborative design environment involving KALCAS is proposed, using the engineering solution CO™ based on the distributed object-based modeling and evaluation (DOME) system. This framework allows users to access the product data and other related information in a wide variety of applications. The paper explores an approximate LCA of product design alternatives represented by solid models in a collaborative design environment.
 
Ubiquitous environments, like future office buildings that partly or fully implement a flexible office organization, require a sophisticated software system that is highly dynamic, scalable, context-aware, self-configuring, self-optimizing, and self-healing. We propose an autonomic middleware approach for such ubiquitous indoor environments and demonstrate the software with our Smart Doorplate project. The middleware uses intensive monitoring at different levels to collect the information needed for the metrics that calculate the 'quality of service provision' used to trigger the self-mechanisms. Global optimization of the system behavior is realized with local monitoring and the exchange of messages between the AMUN (Autonomic Middleware for Ubiquitous environments) nodes. Information exchange between services is based on typed messages, yielding a more flexible communication paradigm than method invocation.
 
Power system islanding is an effective way to avoid catastrophic wide area blackouts, such as the 2003 North American Blackout. Islanding of large-scale power systems is a combinatorial explosion problem. Thus, it is very difficult to find an optimal solution within reasonable time using analytical methods. This paper presents a new method to solve this problem. In the proposed method, Angle Modulated Particle Swarm Optimization (AMPSO) is utilized to find islanding solutions for large-scale power systems due to its high computational efficiency. First, desired generator groups are obtained using the slow coherency algorithm. AMPSO is then used to optimize a fitness function defined according to both generation/load balance and similarity to the desired generator grouping. In doing so, the resulting islanding solutions provide good static and dynamic stability. Simulations of power systems of different scales demonstrate the effectiveness of the proposed algorithm.
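Angle modulation is what lets a continuous PSO drive a binary problem of this kind: the swarm searches only four real coefficients, which are decoded into an arbitrarily long bit string by sampling a trigonometric generating function. The sketch below uses the standard angle-modulation formulation from the PSO literature; the paper's exact encoding of islanding decisions may differ.

    import math

    def angle_modulated_bits(coeffs, n_bits):
        """Decode a 4-D position (a, b, c, d) into n_bits binary decisions:
        bit j is 1 iff g(j) > 0 for the generating function
        g(x) = sin(2*pi*(x - a) * b * cos(2*pi*(x - a) * c)) + d."""
        a, b, c, d = coeffs
        def g(x):
            return math.sin(2 * math.pi * (x - a) * b
                            * math.cos(2 * math.pi * (x - a) * c)) + d
        return [1 if g(j) > 0 else 0 for j in range(n_bits)]

    print(angle_modulated_bits((0.0, 0.5, 0.2, -0.1), 10))

A continuous PSO (such as the sketch shown earlier in this listing) then optimizes the four coefficients against a fitness evaluated on the decoded bits, here the balance- and coherency-based islanding fitness described in the abstract.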
 
This manuscript presents a design for the emergent generation of short-term forecasts in multi-agent coordination and control systems. Food foraging behavior in ant colonies constitutes the main inspiration for the design. A key advantage is the limited exposure of the software agents in the coordination and control system. Each agent corresponds to a counterpart in the underlying system and can be developed and maintained exclusively based on knowledge about its counterpart. This approach to make non-local information available without exposing the software agents beyond their local scope is the research contribution and focus of the discussion in this paper. The research team applies this design to multi-agent manufacturing control systems and to supply network coordination systems, but its intrinsic applicability is broader.
 
This research applies the meta-heuristic method of ant colony optimization (ACO) to an established set of vehicle routing problems (VRP). The procedure simulates the decision-making processes of ant colonies as they forage for food and is similar to other adaptive learning and artificial intelligence techniques such as Tabu Search, Simulated Annealing and Genetic Algorithms. Modifications are made to the ACO algorithm used to solve the traditional traveling salesman problem in order to allow the search of the multiple routes of the VRP. Experimentation shows that the algorithm is successful in finding solutions within 1% of known optimal solutions and the use of multiple ant colonies is found to provide a comparatively competitive solution technique especially for larger problems. Additionally, the size of the candidate lists used within the algorithm is a significant factor in finding improved solutions, and the computational times for the algorithm compare favorably with other solution methods.
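The heart of the procedure is the probabilistic arc-selection rule. The sketch below shows it with the usual pheromone/visibility weighting, and the `candidates` argument mirrors the candidate-list restriction the study found significant (the parameter values are conventional ACO defaults, not the paper's settings):

    import random

    def next_stop(current, candidates, tau, dist, alpha=1.0, beta=2.0):
        """Roulette-wheel selection among candidate stops with probability
        proportional to pheromone^alpha * (1/distance)^beta."""
        weights = [(j, tau[current][j] ** alpha
                       * (1.0 / dist[current][j]) ** beta)
                   for j in candidates]
        r = random.uniform(0.0, sum(w for _, w in weights))
        for j, w in weights:
            r -= w
            if r <= 0.0:
                return j
        return weights[-1][0]   # numerical fallback

    # After each iteration, pheromone evaporates and good routes are
    # reinforced, e.g. tau[i][j] = (1 - rho) * tau[i][j] + rho * deposit.

Restricting `candidates` to the nearest unvisited stops shrinks the choice set at each step, which is why candidate-list size has such a visible effect on solution quality and runtime.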
 
Tasks in the construction industry and urban management field, such as site selection and fire response management, are usually handled using a Geographical Information System (GIS), as these processes require a high level and amount of integrated geospatial information. Recently, a key element of this integrated geospatial information to emerge is detailed geometrical and semantic information about buildings. In parallel, today's Building Information Models (BIMs) have the capacity for storing and representing such detailed geometrical and semantic information. In this context, the research investigated the applicability of BIMs in the geospatial environment by focusing specifically on two domains: site selection and fire response management. In the first phase of the research, two use case scenarios were developed in order to understand the processes in these domains in more detail and to establish the scope of possible software development for transferring information from BIMs into the geospatial environment. In the following phase, two data models were developed: a Schema-Level Model View and a geospatial data model. The Schema-Level Model View was used to simplify the information acquired from the BIM, while the geospatial data model acted as the template for creating physical files and databases in the geospatial environment. Following this, three software components for transferring building information into the geospatial environment were designed, developed, and validated. The first component served to acquire the building information from the BIM, while the latter two served to transform the information into the geospatial environment.
 
In this paper, we present the findings from an extensive study of the use of virtual reality (VR) models in large construction projects. The study includes two parts: the first presents a quantitative questionnaire designed to investigate how VR models are experienced and assessed by the workforce at a building site; the second includes a qualitative field survey of how VR models can be applied and accepted by professionals in the design and planning process of a large pelletizing plant. By mainly studying persons who had little or no experience with advanced information technology (IT), we hoped to reveal the attitudes of the average person working at a construction site rather than those of an IT expert. In summary, the study shows that the VR models in both projects have been very useful and well accepted by the users. Today's information flow is, from a general point of view, considered insufficient, and the hypothesis is that using VR models in the construction process has the potential to minimize the waste of resources and improve the final result.
 
Using the customer relationship management perspective to investigate customer behavior, this study differentiates between customers through customer segmentation, tracks customer shifts from segment to segment over time, discovers customer segment knowledge to build an individual transition path and a dominant transition path, and then predicts customer segment behavior patterns. By using real-world data, this study evaluates the accuracy of predictive models. The concluding remarks discuss future research in this area.
 
As the fields of design automation and generative design systems (GDS) evolve, more emphasis is placed on issues of design evaluation. This paper focuses on presenting different applications of GENE_ARCH, an evolution-based GDS aimed at helping architects achieve energy-efficient and sustainable architectural solutions. The system applies goal-oriented design, combining a genetic algorithm (GA) as the search engine with the DOE2.1E building energy simulation software as the evaluation module. Design evaluation is based on the energy spent for heating, cooling, ventilation and artificial lighting in the building, and on sustainability issues such as the greenhouse gas emissions associated with the embodied energy of construction materials. The GA can work either as a standard GA or as a Pareto GA for multicriteria search and optimization. In order to provide a broad view of the capabilities of the software, different applications are discussed: (1) standard GA: testing and validating the software; (2) standard GA: incorporation of architectural design intentions, using a building by architect Alvaro Siza; (3) Pareto GA: choice of construction materials, considering cost, building energy use, and embodied energy; (4) Pareto GA: application to Siza's building, considering thermal and lighting behavior separately; (5) standard GA: shape generation with a single objective function; (6) Pareto GA: shape generation with multicriteria evaluation; (7) Pareto GA: application to an urban and housing context. Overall conclusions from the different applications are discussed, as well as current challenges, limitations, and directions for further work.
 
Integration and coordination of distributed processes remains a central challenge of construction information technology research. Extant technologies, while capable, are not yet scalable enough to enable rapid customization and instantiation for specific projects. Specifically, the heterogeneity of existing legacy sources together with firms’ range of approaches to process management makes deployment of integrated information technologies impractical. This paper reports on an architecture for distributed process integration named process connectors that addresses heterogeneity in a scalable manner. The process connectors architecture incorporates two key approaches that address heterogeneity over varying time scales. The SEEK: Scalable Extraction of Enterprise Knowledge toolkit is reviewed as a mechanism to discover semantically heterogeneous source data. The SEEK approach complements existing data integration methods for persistent sharing of information. To make use of shared data on a per project basis, a schedule mapping approach is presented that integrates firms’ diverse individual schedules in a unified representation. The schedule mapping approach allows integration of process views that have different levels of detail, while allowing participants to maintain their own view of the process. Collectively, SEEK and the schedule mapping approach facilitate a broad range of analyses to support coordination of distributed schedules. While this paper focuses primarily on schedule process integration, the process connectors architecture is viewed as providing a broad solution to discovery and integration of firms’ process data.
 
This research investigates the use of modern electronic communication management systems and how these systems affect the success of construction projects in the United Arab Emirates (UAE). The research starts with a literature survey and a brief background on how the communication mechanism works, how using these systems influences relationships amongst the project stakeholders, and consequently the projects' success. Two case studies are introduced, followed by an analysis of results and conclusions. The first case study, based on action research, employs interactive tools to collect the evidence, including interviews, surveys, document review, and feedback on progress. It uses success criteria from construction projects in the UAE, previously identified by the authors. This case study has revealed an organisational transformation trend from functional towards matrix and project structures. These types of change are taking place after the implementation of project electronic communication management systems in the client organisation and are enhancing the chances of project success. The second case study takes into consideration the co-existence of the new electronic communication systems with the other, traditional communication media. It has been shown that such an arrangement works for the strategic benefit of both the projects and the project stakeholders. In the areas of improvements to schedule and project control, the current research results are in agreement with the pertinent published literature and research findings. However, the benefits for quality control during the design and construction phases of the project, in addition to potential improvements in health, safety and environment (HSE), remain debatable.
 
This paper reports a comparative study of two machine learning methods for Arabic text categorization. Using a collection of news articles as a training set and another set of news articles as a testing set, we evaluated the k-nearest neighbor (KNN) algorithm and the support vector machine (SVM) algorithm. We used full-word features, tf.idf as the term-weighting method, and the CHI statistic as the ranking metric for feature selection. Experiments showed that both methods performed well on the test corpus, while SVM achieved better micro-averaged F1 and prediction time.
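For orientation, the experimental pipeline maps naturally onto today's scikit-learn idioms; the study itself predates the library, and the CHI percentile and KNN neighbourhood size below are assumed hyperparameters, not the paper's values.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.feature_selection import SelectPercentile, chi2
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import LinearSVC
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import f1_score

    def evaluate(train_texts, train_labels, test_texts, test_labels):
        """tf.idf on full-word features, CHI-squared feature ranking, then
        KNN and SVM compared by micro-averaged F1, as in the study."""
        scores = {}
        for name, clf in (("KNN", KNeighborsClassifier(n_neighbors=5)),
                          ("SVM", LinearSVC())):
            model = make_pipeline(TfidfVectorizer(),
                                  SelectPercentile(chi2, percentile=20),
                                  clf)
            model.fit(train_texts, train_labels)
            scores[name] = f1_score(test_labels, model.predict(test_texts),
                                    average="micro")
        return scores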
 
Top-cited authors
Ioannis Brilakis
  • University of Cambridge
Jochen Teizer
  • Technical University of Denmark
Emad Elbeltagi
  • Mansoura University
Tarek Hegazy
  • University of Waterloo
Christian Koch
  • Bauhaus-Universität Weimar