Article

Novel Machine-Learning Model for Estimating Construction Costs Considering Economic Variables and Indexes


Abstract

In addition to materials, labor, equipment, and method, construction cost depends on many other factors, such as the project locality, type, construction duration, scheduling, and the extent of use of recycled materials. Further, the fluctuation of economic variables and indexes (EV&Is), such as liquidity, the wholesale price index, and the building services index, causes variation in costs. These changes may increase or reduce the construction cost, are hard to predict, and are normally ignored in traditional cost estimation. This paper presents an innovative construction cost estimation model using advanced machine-learning concepts and taking the EV&Is into account. A data structure is proposed that incorporates a set of physical and financial (P&F) variables of the real estate units as well as a set of EV&I variables affecting the construction costs. The model includes an unsupervised deep Boltzmann machine (DBM) learning approach along with a softmax layer (DBM-SoftMax), and a three-layer back-propagation neural network (BPNN) or another regression model, a support vector machine (SVM). The role of DBM-SoftMax is to extract relevant features from the input data. The role of the BPNN or SVM is to turn the trained unsupervised DBM into a supervised regression network. This combination improves the effectiveness and accuracy of both conventional BPNN and SVM. A sensitivity analysis was performed within the algorithm in order to achieve the best results, taking into account the impact of the EV&I factors at different times (time lags). The model was verified using construction cost data for 372 low- and midrise buildings in the range of three to nine stories. Cost estimation errors of the proposed model were much less than those of both the BPNN-only and SVM-only models, demonstrating the effectiveness of the strategies employed in this research and the superiority of the proposed model.
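To make the two-stage architecture concrete, the sketch below pairs unsupervised feature learning on the P&F and lagged EV&I inputs with a supervised regressor. It is only a rough stand-in: scikit-learn provides restricted Boltzmann machines rather than true deep Boltzmann machines, so two stacked BernoulliRBMs approximate the DBM-SoftMax stage, and the data, layer sizes, and hyperparameters are invented, not taken from the paper.

```python
# Minimal sketch of the hybrid idea: unsupervised feature learning followed
# by a supervised regressor (SVR here; swapping in
# sklearn.neural_network.MLPRegressor would play the BPNN role).
# NOTE: two stacked BernoulliRBMs are only a stand-in for the paper's
# DBM-SoftMax stage; all data and hyperparameters are invented.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.svm import SVR
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
X = rng.random((372, 27))                 # e.g. P&F variables + lagged EV&Is
y = X @ rng.random(27) + 0.1 * rng.standard_normal(372)  # synthetic cost

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = MinMaxScaler().fit(X_train)      # RBMs expect inputs in [0, 1]
X_train_s, X_test_s = scaler.transform(X_train), scaler.transform(X_test)

# Greedy layer-wise unsupervised pretraining (DBM stand-in).
rbm1 = BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=30, random_state=0)
rbm2 = BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=30, random_state=0)
H_train = rbm2.fit_transform(rbm1.fit_transform(X_train_s))
H_test = rbm2.transform(rbm1.transform(X_test_s))

# Supervised stage turns the learned features into a regression model.
reg = SVR(C=10.0, epsilon=0.01).fit(H_train, y_train)
print("MAPE:", mean_absolute_percentage_error(y_test, reg.predict(H_test)))
```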


... Accurate construction cost estimation is particularly important at the tendering and planning stages of projects [3], where information and time are limited, for ensuring successful project completion [4]. To absorb the financial risk of the construction budget, a contingency reserve is allocated. ...
... In the last ten years, there has been a significant decrease in the cost of computing power [11], which has also reduced computational costs, enabling recent studies to explore the potential of machine learning (ML) in predicting construction costs [1,3,12-15]. Some studies have experimented with simple regression models [14,16], support vector regression [16,17], decision trees [16,18], and neural networks [1,13,15,19], while others have used advanced ensemble ML models [18] and deep learning approaches, such as the deep Boltzmann machine along with a softmax layer [3] and graph convolutional networks [20,21]. While these models are effective in providing cost comparisons between projects, they are not adapted to find relationships among the varying categories in feature inputs, such as different activities with user inputs or user interactions to provide tailored suggestions. ...
Article
Full-text available
Management of contingency reserves involves identifying and prioritizing potential high-cost impact events, serving as a cushion for absorbing the financial risks of projects. Machine learning (ML) models exist for estimating rework costs; however, they cannot recommend related activities that influence contingency costs. This research proposes a novel approach that integrates a construction contingency network, advanced node2vec algorithms, and cosine similarity measures to identify construction activities with similar contingency costs, facilitating the management and planning of rework costs. The proposed system offers tailored recommendations and aids in project management by reducing guesswork using the design science research (DSR) methodology that combines advanced ML techniques with practical construction management strategies to provide a robust tool for navigating the complexities of rework costs. The configured recommendation system achieved an 82% accuracy in its suggestions for critical construction activities with a high-cost impact, along with a 4% loss, demonstrating good generalization. Novelty of this research lies in its first-time development of a recommendation model capable of generating dynamic recommendations of the activities that impact the contingency budget, supporting the existing cost forecasting model.
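The core of the approach above, embedding a network of activities and ranking neighbors by cosine similarity, can be sketched with the third-party node2vec package (built on networkx and gensim). The toy graph, edge weights, and hyperparameters below are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of the recommendation idea: embed a contingency network
# with node2vec, then rank activities by cosine similarity. Assumes the
# third-party `node2vec` package; the graph and weights are invustrative toys.
import networkx as nx
from node2vec import Node2Vec

G = nx.Graph()
# Nodes = construction activities; weights ~ shared contingency-cost impact.
G.add_weighted_edges_from([
    ("excavation", "shoring", 0.9),
    ("excavation", "dewatering", 0.7),
    ("formwork", "rebar", 0.8),
    ("rebar", "concreting", 0.9),
    ("shoring", "dewatering", 0.6),
])

n2v = Node2Vec(G, dimensions=16, walk_length=10, num_walks=50, workers=1)
model = n2v.fit(window=5, min_count=1)

# gensim's most_similar ranks by cosine similarity of the learned embeddings.
print(model.wv.most_similar("excavation", topn=3))
```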
... Detailed design refers to obtaining building information only after completing architectural plans and structural drawings, so models involving detailed design are not suitable for early-stage estimation of new construction projects (Matel et al., 2022; Ugur et al., 2018). Only three scholars considered external influences such as inflation (Ali et al., 2022), lending rates (Wang et al., 2022), and economic impacts (Rafiei and Adeli, 2018). ...
... For example, RA + ANN models developed by Wang et al. (2023) only consider a few variables influencing building costs: total floor area, height, and number of floors; Uysal and Sonmez (2023) only utilized five building attributes: location, time of construction, number of floors, building height and floor area. However, apart from the recognition of building-related attributes, economic variables, as external attributes influencing construction costs, will be a key area of research in the future (Rafiei and Adeli, 2018; Saeidlou and Ghadiminia, 2023). Additionally, external social conditions, such as the occurrence of pandemic diseases, can lead to an increase in force majeure costs, resulting in lower accuracy in construction cost estimation (Wang et al., 2022). ...
... Looking ahead, there's a compelling avenue for researchers to extend the application of existing ML models in different types of cost estimation. For example, the BPNN as a classic ANN was commonly used in project cost estimation after detailed design: single BPNN (Jiang, 2019; Wang et al., 2018), hybrid PSO + BPNN (Ye, 2021), hybrid GA + BPNN (Du and Li, 2017) and hybrid deep Boltzmann machine (DBM) + BPNN (Rafiei and Adeli, 2018). Therefore, using BPNN for conceptual cost estimation in the initial stage of the project has also become an option in the future. ...
Article
Purpose Machine learning (ML) technologies are increasingly being applied in building cost estimation as an advanced method to overcome the challenge of insufficient data and the subjective effects of experts. To address the lack of a review of ML applications in building cost estimation, this research aimed to conduct a systematic literature review to provide a robust reference and suggest development pathways for creating novel ML-based building cost prediction models, ultimately enhancing construction project management capabilities. Design/methodology/approach A systematic literature review according to preferred reporting items for systematic reviews and meta-analyses (PRISMA) was adopted using quantitative bibliographic analysis and qualitative narrative synthesis based on the 70 screened publications from Web of Science (WOS) and Scopus databases. The VOSviewer software was used to prepare the thematic focus from the bibliographic data garnered. Findings Based on the results of the bibliographic analysis, current research hotspots and future trends in the application of ML to building cost estimation were identified. Additionally, the mechanisms behind existing ML models and other key points were analyzed using narrative synthesis. Importantly, the weaknesses of current applications were highlighted and recommendations for future development were made. These recommendations included defining the availability of building attributes, increasing the application of emerging ML algorithms and models to various aspects of building cost estimation and addressing the lack of public databases. Research limitations/implications The findings are instrumental in aiding project management professionals in grasping current trends in ML for cost estimation and in promoting its adoption in real-world industries. The insights and recommendations can be utilized by researchers to refine ML-based cost estimation models, thereby enhancing construction project management. Additionally, policymakers can leverage the findings to advocate for industry standards, which will elevate technical proficiency and ensure consistency. Originality/value Compared to previous research, the findings revealed research hotspots and future trends in the application of ML cost estimation models in building projects only. Additionally, the analysis of the establishment mechanisms of existing ML models and other key points, along with the developed recommendations, is more beneficial for developing improved ML-based cost estimation models, thereby enhancing project management capabilities.
... In the highly competitive construction industry, costs form an essential part of project management (Abbasi et al., 2020; Noorzai et al., 2022). Researchers have to date improved cost overrun estimation using a range of methods and instruments: neural networks (ElSawy et al., 2011; Ahiaga-Dagbui & Smith, 2012; El-Kholy, 2013; Alex et al., 2010; Karaca et al., 2020; Asghari et al., 2021), machine learning algorithms (Rafiei & Adeli, 2018; Pham et al., 2023), and multiple regression analysis (Amadi, 2023; Coffie et al., 2019). According to Baccarini and Love (2014), preliminary cost estimating must provide the required level of assurance before capital commitments are made. ...
... Engineering and Construction (AEC) domain, yielding promising results (Abdullahi et al., 2024a, 2024b; Yamusa et al., 2024). Rafiei and Adeli (2018) used an unsupervised deep Boltzmann machine (DBM) learning approach along with a SoftMax layer (DBM-SoftMax), and a three-layer back-propagation neural network (BPNN) or another regression model, support vector machine (SVM), for cost prediction. The study found that the cost estimation errors of the models were much less than those of both the BPNN-only and SVM-only models. ...
Article
Purpose: A serious concern for construction costs has been the presence of uncertainties in construction operations and how they affect project performance. Several models exist for predicting construction project costs. However, these models overlook the effects of uncertainties on construction costs. This study, therefore, aims to develop a predictive model that considers uncertainty when estimating building renovation project costs. Design/methodology/approach: The study employed project scope factors and 45 uncertainty factors in the model development. SHapley Additive exPlanations (SHAP) was used to reveal the uncertainty factors that had a significant impact on the construction costs and to improve the performance of the model. The study then used the outcome of the sensitivity analysis along with the project scope factors to train and test a prediction model using XGBoost. Findings: The study found Crude Oil Price, Project complexity, Delays in payment, Regulatory requirements and Inappropriate design to have the most significant impact on construction renovation project costs. The XGBoost model for predicting construction renovation project costs has produced promising outcomes with an accuracy of 91.20%. Practical implications: Findings from this study will enable project managers and stakeholders to make informed decisions, optimise resource allocation, and mitigate project risks. Originality: To improve the cost performance of construction renovation projects, it is essential to take uncertainty into account, along with its impact on predictions and the accuracy and value of model predictions. In this study, a novel machine learning approach was developed to predict the construction cost of renovation projects by leveraging the uncertainty factors.
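A rough sketch of the SHAP-plus-XGBoost workflow described above: train a gradient-boosted regressor on scope and uncertainty features, then rank features by mean absolute SHAP value. It assumes the xgboost and shap packages; the feature names and toy data are invented, not the study's dataset.

```python
# Sketch: XGBoost cost model + SHAP-based ranking of uncertainty drivers.
# Feature names and data are hypothetical illustrations.
import numpy as np
import shap
from xgboost import XGBRegressor

rng = np.random.default_rng(1)
features = ["floor_area", "crude_oil_price", "payment_delay", "complexity"]
X = rng.random((200, len(features)))
y = 3 * X[:, 1] + 2 * X[:, 2] + rng.normal(0, 0.1, 200)  # toy cost signal

model = XGBRegressor(n_estimators=200, max_depth=3).fit(X, y)

# Mean |SHAP value| per feature approximates global importance.
shap_values = shap.TreeExplainer(model).shap_values(X)
for name, imp in sorted(zip(features, np.abs(shap_values).mean(axis=0)),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```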
... With the development of ML, increasingly innovative and advanced neural networks have been developed, and the main application of neural networks in conceptual cost estimation models is ANN (Cheng, Tsai, and Hsieh 2009;Wang et al. 2023). The BPNN is a classic ANN model that enhances accuracy and performance by automatically modifying the neural network's weights and biases using the backpropagation method to minimize the loss function (El Hakea and Eid 2022; Jiang 2019; Rafiei and Adeli 2018), as shown in Figure 1. ...
... In addition, Wang, Yuan, and Ghafoor (2021) applied the grey (1,1) model to preprocess the input variables by accumulative generation operation, which provides smoother input for BPNN. According to the research by Rafiei and Adeli (2018), the DBM-SoftMax extracts relevant features from the input data, and the BPNN transforms the trained unsupervised DBM into a supervised regression network to improve its effectiveness and accuracy. ...
Article
Full-text available
Accurate conceptual cost estimation is vital in construction project management for effective feasibility studies before project initiation. Relying on rough experiential estimates can lead to significant errors and constrain bid prices, risking financial losses. Machine learning (ML) offers a way to bypass expert input and manual quantity surveying, addressing the challenge of inadequate initial estimation data. However, a demand-oriented conceptual cost estimation model based on ML is lacking in the preliminary design phase to assess cost-influencing factors comprehensively. This research develops an optimal model by comparing the conceptual cost estimation performance of hybrid Dung Beetle Optimizer (DBO) + Back-Propagation Neural Network (BPNN), Genetic Algorithm (GA) + BPNN, Particle Swarm Optimization (PSO) + BPNN, and single BPNN models. First, the 20 key input variables affecting conceptual cost estimation were determined by SHAP and the correlation matrix together; second, in simulation experiments using a dataset of 117 general building projects in MATLAB, the DBO+BPNN model with 12 hidden-layer neurons achieved the best cost estimation performance in a comparison based on score-maker methods with 2 correlation metrics and 5 accuracy metrics. Importantly, implementing this model can provide decision-makers with reliable conceptual cost information, enhancing the likelihood of project success.
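The metaheuristic-plus-BPNN pattern compared in this study can be illustrated with a tiny particle swarm searching MLP hyperparameters (hidden units and learning rate) against cross-validated error. This is a generic stand-in for the DBO/GA/PSO + BPNN hybrids, with invented data and swarm settings, not the paper's MATLAB setup.

```python
# Sketch: a small PSO tunes an MLP's hidden size and learning rate using
# 3-fold cross-validated MSE as the fitness function. Toy data only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=117, n_features=20, noise=10, random_state=0)

def fitness(pos):
    """Cross-validated MSE of an MLP with the encoded hyperparameters."""
    hidden, log_lr = int(round(pos[0])), pos[1]
    mlp = MLPRegressor(hidden_layer_sizes=(hidden,),
                       learning_rate_init=10 ** log_lr,
                       max_iter=500, random_state=0)
    return -cross_val_score(mlp, X, y, cv=3,
                            scoring="neg_mean_squared_error").mean()

rng = np.random.default_rng(0)
lo, hi = np.array([4.0, -3.5]), np.array([32.0, -1.5])   # search bounds
pos = rng.uniform(lo, hi, (8, 2))                        # 8 particles
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()]

for _ in range(10):                                      # PSO iterations
    r1, r2 = rng.random((2, *pos.shape))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()]

print("best hidden units:", int(round(gbest[0])),
      "| best learning rate:", round(10 ** gbest[1], 5))
```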
... Table 1 shows the existing techniques for maintenance cost estimation: one group of 12 sources ([4], [5], [7], [8], [11], [17], [21], [29], [30], [31], [34], [36]), Deep Learning with 10 sources ([2], [4], [5], [6], [19], [20], [28], [32], [33], [37]), and Data Mining with 8 sources ([3], [10], [12], [15], [16], [22], [23]). ...
... Precision in cost estimation is essential for achieving cost savings and fostering sustainability. Utilizing an unsupervised deep Boltzmann machine (DBM) learning method in conjunction with a softmax layer, together with a three-layer backpropagation neural network (BPNN) or an alternative regression model such as a support vector machine (SVM), can facilitate advanced data analysis and prediction [33]. ...
Conference Paper
Construction equipment has an important role in executing construction site projects effectively and successfully. Every construction company essentially faces a high initial cost investment in construction equipment, and the biggest challenge is to operate and maintain high-value heavy equipment. A company's expenditures extend beyond the substantial initial cost of equipment, encompassing significant amounts spent on repairing or replacing malfunctioning machines, which places a substantial burden on overall expenses. The maintenance of construction equipment involves a proactive and scheduled method for repairing equipment, as opposed to a reactive, unscheduled approach after any breakdown. This study highlights the various data-driven, model-driven, and tech-driven structured methodologies used in the existing literature for the estimation of maintenance costs. The proposed study advocates for an organized methodology for estimating maintenance costs from resource quantity prediction of construction equipment using machine learning models, entailing a systematic evaluation and projection of expenses related to maintaining equipment materials, especially the repair and replacement of spare parts in peak operational conditions. This helps organizations estimate maintenance costs accurately, plan for future expenditures, and optimize the lifespan and performance of construction equipment.
... Moreover, numerous methods exist to estimate the total cost of construction projects. Researchers have mainly used neural networks [1,16,37-46], different types of regression models [1,31,42,47-51], case-based reasoning [1,52-55], BIM-based cost estimation [56-59], support vector machines and random forests [31,60-62], genetic algorithms [37,38,52], and Monte Carlo simulation [63,64] to estimate the total construction cost of a project, mainly in the early phases. Recent studies indicate that ML approaches have been progressively employed in the construction industry for cost estimation. ...
... Additionally, it is noteworthy that most of the research focused on the early phases of the construction process. Neural networks [1,37,38,40,42,65], regression models [1,31,42,49,51], and BIM-based cost estimation [57,59] were the predominantly employed methods, mainly in residential structures and high-rise projects, to estimate total costs during preliminary stages. Despite the significant value of these studies in guiding future research, there is a need for estimating models that can be updated efficiently and swiftly at any moment in the execution phase of a construction project. ...
Article
Full-text available
Estimating the completion cost accurately in the early phases of construction projects is critical to their success. However, cost overruns are almost inevitable due to the risks inherent in construction projects. Hence, the completion cost fluctuates throughout the execution phase and requires periodic updates. There is a need for a prompt and user-friendly completion cost estimation model that accounts for fluctuating risk scores and their impacts on the total cost during the execution phase. Machine learning (ML) techniques could address these requirements by providing effective methods for tackling dynamic systems. The proposed approach aims to predict the cost overrun ratio classes of the completion cost according to the changes in the total risk scores at any time of the project. Six classification algorithms were utilized and validated by employing 110 data points from a globally operating construction company. The performances of the algorithms were evaluated with validation and performance indices. The decision tree classifier surpassed the other algorithms. Although there are some research limitations, including risk perception, data gathering restrictions, and selecting proper ML algorithms based on data properties, this research improves the planning abilities of construction executives by providing a cost overrun ratio based on changing total risk scores, facilitating swift and simple assessments at any stage of a construction project’s execution.
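A minimal sketch of the classification setup described above: map total risk scores to cost-overrun ratio classes with a decision tree, the algorithm this study found best. The class bins and toy data are invented assumptions.

```python
# Sketch: risk scores -> cost-overrun ratio class via a decision tree.
# Bins and data are hypothetical, not the study's 110 project records.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(2)
risk_scores = rng.uniform(0, 100, (110, 3))            # e.g. grouped risk scores
overrun = 0.002 * risk_scores.sum(axis=1) + rng.normal(0, 0.05, 110)
classes = np.digitize(overrun, bins=[0.1, 0.25, 0.5])  # 4 overrun-ratio bands

X_tr, X_te, y_tr, y_te = train_test_split(risk_scores, classes, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), zero_division=0))
```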
... This model had far fewer estimation errors than backpropagation neural networks (BPNNs) alone or support vector machines (SVMs) alone, especially under dynamic economic conditions. The model uses factors such as liquidity, wholesale price indexes, and CCIs to provide a responsive tool for accurate cost estimations [58]. ...
... This analysis provided insights into the most influential documents and sources within the CCI research field. Foundational studies such as Makridakis (2017) [49], Heckmann et al. (2015) [50], and Rafiei and Adeli (2018) [58] have been widely cited for their important influence on AI, supply chain, and machine learning applications, which can inform future investigation in construction cost estimation. Journals and sources like International Journal of Forecasting, Applied Sciences, and Buildings are highly respected platforms with substantial impact on research, especially in the Construction Cost Index forecasting and estimation field. ...
Article
Full-text available
The Construction Cost Index (CCI) is an important tool that is widely used in construction cost management to monitor cost fluctuations over time. Numerous studies have been conducted on CCI development and forecasting models, including time series, artificial intelligence, machine learning, and hybrid models. Therefore, this study seeks to reveal the complexity of CCI forecasting and identify the leading indicators, trends, and techniques for CCI prediction. A bibliometric analysis was conducted to explore the landscape in the CCI literature, focusing on co-occurrence, co-authorship, and citation analysis. These analyses revealed the frequent keywords, the most cited authors and documents, and the most productive countries. The research topics and clusters in the CCI forecasting process were presented, and directions for future research were suggested to enhance the prediction models. A case study was conducted to demonstrate the practical application of a forecasting model to validate its prediction reliability. Furthermore, this study emphasizes the need to integrate advanced technologies and sustainable practices into future CCI forecasting models. The findings are useful in enhancing the knowledge of CCI prediction techniques and serve as a base for future research in construction cost estimation.
... c. Support Vector Machine (SVM): SVM is a robust algorithm that seeks to identify the optimal hyperplane separating distinct classes within the feature space (Rafiei and Adeli 2018). ... (Louk and Tama 2023). Through careful orchestration of decision trees and gradient-based optimization, LightGBM constructs complex models that excel in both accuracy and computational efficiency. ...
... Choi et al. (2021) extended these models to individual cities, developing a framework to choose between univariate and multivariate approaches based on macroeconomic factors. Various techniques have been employed to forecast construction output, including regression (Hwang 2009), ARIMA, vector error correction (VEC) (Jiang and Liu 2014, Shahandashti and Ashuri 2016), neural networks (NN) (Shiha et al. 2020), comparison of multiple methods (Lam and Oshodi 2016), and hybrid machine learning methods (Rafiei and Adeli 2018). Univariate sequence-to-sequence (seq2seq) models predict mid- and long-term Construction Cost Index (CCI) values (Cao and Ashuri 2020), leveraging LSTM cells for their memory capabilities. ...
Preprint
The study predicts construction hiring by considering socioeconomic conditions, political elements, and extreme weather events. The research aims to create a predictive model to help construction companies plan future hiring levels more effectively. By analyzing historical data on construction hiring and related variables, the model forecasts hiring demand at both national and state levels, allowing contractors to adjust their hiring plans and ensure job security and industry stability. The methodology combines deep learning algorithms with ensemble learning to process diverse datasets, including state-specific features and time-dependent variables. The anticipated outcome is a robust predictive framework to alert companies to market disruptions well in advance, moving from a reactive to a proactive approach in managing workforce dynamics. This research contributes to the resilience of the construction workforce, ultimately enhancing job security and stability within the industry. The model outperformed the more commonly used auto-regressive models by achieving a lower overall Mean Absolute Error (MAE) in predictions 6, 12, and 24 steps ahead, and the feature importance results highlight similar patterns among important construction markets in the US.
... Seya and Shiroi [65] claim that deep neural networks with integrated nearest neighbour Gaussian processes have more potential for residential rent pricing prediction. Rafiei and Adeli [66] suggest using a softmax layer in combination with an unsupervised deep Boltzmann machine (DBM) learning technique to extract significant features from the input data in order to forecast construction costs utilising economic factors and indices. The fine-scale spatiotemporal distribution of residential land prices was successfully predicted by Zhang, Hu, Li, Zhang, Yang, and Qu [67] using both the support vector regression method via radial basis functions and the extratrees regression strategy. ...
Article
Full-text available
The Chinese real estate market grew at a rapid rate over the last few decades, up to the current falling patterns that began at the end of 2021. This decline, a result of the current state of the economy, has made it more difficult for the government and investors to predict future property prices effectively. In this research, we examine the monthly residential property prices in Hangzhou City, Zhejiang Province, China, using Gaussian process regressions with a variety of kernels and basis functions. This research spans the months of January 2009 through July 2024. We use the estimated models in our forecasting efforts. A combination of cross-validation and Bayesian optimisations is used to train these models. The generated models successfully predicted the out-of-sample prices from June 2021 to July 2024, with a relative root mean square error of 1.0419 per cent. It is plausible that our findings may be used alone or in combination with further projections to formulate theories about fluctuations in residential real estate prices and to conduct supplementary policy analysis.
... Hashemi et al. (2020) thoroughly investigated how machine learning techniques affect construction project cost estimation and analyzed three quantitative methods: statistical models, analogy models, and analysis models. Rafiei and Adeli (2018) utilized advanced machine learning concepts while considering economic variables and indices to propose an innovative model for estimating construction costs. The results showed that this model had significantly lower cost estimation errors than the support vector machine (SVM) and pure BPNN models. ...
Article
Making accurate predictions of the construction cost is essential for ensuring the smooth implementation of projects and guaranteeing economic benefits. The problem studied in this article is how to predict construction project costs accurately. The related factors affecting construction project costs are briefly introduced in this paper. A back-propagation neural network (BPNN) was proposed to predict construction engineering costs, and the AdaBoost algorithm was used to improve it. Then, simulation experiments were carried out. It was found that the AdaBoost-BPNN algorithm converged to stability faster, and the mean square error was smaller (on the order of 10^-5) when stable. Compared with the support vector machine and traditional BPNN algorithms, the AdaBoost-BPNN algorithm had better goodness of fit (0.787) and provided more accurate prediction results for construction engineering costs (mean absolute error: 0.467, root-mean-square error: 1.118). The novelty of this article lies in utilizing AdaBoost to combine multiple weak predictors into a strong predictor, thereby enhancing the performance of the BPNN algorithm. The contribution lies in improving the predictive performance of the BPNN through the combination principle of AdaBoost, providing an effective reference for accurate cost prediction in construction engineering.
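The AdaBoost-BPNN combination can be sketched with scikit-learn's AdaBoost.R2 regressor wrapped around small MLPs standing in for BPNN weak learners. The data are synthetic, and the keyword `estimator` assumes scikit-learn 1.2 or later (earlier versions call it `base_estimator`).

```python
# Sketch: AdaBoost.R2 re-weights training samples and aggregates several
# small MLP (BPNN-like) regressors into a stronger predictor. Toy data.
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=300, n_features=10, noise=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

weak_bpnn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=800, random_state=0)
boosted = AdaBoostRegressor(estimator=weak_bpnn, n_estimators=10,
                            learning_rate=0.5, random_state=0).fit(X_tr, y_tr)

single = MLPRegressor(hidden_layer_sizes=(16,), max_iter=800,
                      random_state=0).fit(X_tr, y_tr)
print("boosted MAE:", mean_absolute_error(y_te, boosted.predict(X_te)))
print("single  MAE:", mean_absolute_error(y_te, single.predict(X_te)))
```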
... Additionally, Langenberger et al. [18] applied machine learning techniques, including RF and Gradient Boosting Machines (GBM), to predict high-cost patients in the healthcare sector, demonstrating that tree-based models outperformed other approaches in complex cost prediction tasks. Rafiei and Adeli [19] also successfully applied deep Boltzmann machines and neural networks to improve cost estimation in construction. Despite these advancements, the specific application of these models to the unique challenges of artisanal underground mining remains underexplored. ...
... Many scholars have conducted research on construction cost prediction using various improved methods. These methods include artificial neural networks [2], integrated BIM and Elman neural networks [3], an improved bidirectional long short-term memory (BiLSTM) network [4], a random forest optimized by the bird swarm algorithm [5], a stacking heterogeneous ensemble learning method [6], a novel machine learning model considering economic variables and indexes [7], a hybrid natural and light gradient boosting model [8], and other methods. Some scholars have made predictions on the cost of green building projects [9,10]. ...
Article
Full-text available
In order to predict the cost of construction projects more accurately for cross-sectional data such as housing costs, a fractional heterogeneous grey model based on the principle of similar information priority was proposed in this paper. The advantages of the proposed model are proved by the stability analysis of the solution. The similarity between predicted samples and existing samples was analyzed, and the priority order of cross-sectional information was distinguished according to the similarity of the index information. The factors affecting the cost of construction projects were sorted by similarity, and the samples with high similarity to predicted samples were ranked first. Since projects with similar influence factors tend to produce similar project costs, such a ranking method can effectively utilize the information of similar projects and help improve prediction accuracy. In addition, compared with the prediction results of other models, it is verified that the method of prioritizing similar information can obtain more accurate prediction results.
... Through systematic management of construction projects in all aspects, as well as cost control, enterprises can use a variety of ways and means to reduce the cost of engineering projects while constantly optimizing the enterprise structure and the allocation of funds. Only in this way can the goals of both reducing construction costs and improving economic efficiency be realized [6]. ...
Article
Full-text available
The increasingly competitive market situation in the construction industry requires that construction enterprises strengthen the cost control of construction projects in order to improve their economic efficiency. This paper uses an improved genetic algorithm to optimize the cost control strategy of construction projects, effectively overcoming the problems of weak cost control consciousness and high material procurement costs in previous cost control practice. A construction company is selected as the object for the case study, and the genetic algorithm is used to calculate and analyze the company's cost control. The return on investment, net present value, and internal rate of return are used as indicators to measure the economic benefits before and after optimizing the cost control strategy. The improved genetic algorithm shows that, among the company's cost control factors, procurement management and field operation have a high probability of being selected in the genetic calculation, and the final project cost obtained through the genetic algorithm is 36.7849 million yuan, which achieves the goal of project cost control. At the same time, this paper finds that cost control has a significant positive effect on the improvement of economic efficiency, and the economic efficiency of this construction company improved significantly after the optimization of the cost control strategy.
... Using data scraping technology and research by domestic and foreign scholars on factors influencing residential construction cost predictions, a dataset was preliminarily determined [34-36]. This dataset includes 47 residential construction projects in Shanghai, comprising 1 output variable, 'unit cost', and 17 input variables, as shown in Table 1. ...
Article
Full-text available
In the early stages of residential project investment, accurately estimating the engineering costs of residential projects is crucial for cost control and management of the project. However, the current cost estimation of residential engineering in China is primarily carried out by cost personnel based on their own experience. This process is time-consuming and labour-intensive, and it involves subjective judgement, which can lead to significant estimation errors and fail to meet the rapidly developing market demands. Data collection for residential construction projects is challenging, with small sample sizes, numerous attributes, and complexity. This paper adopts a hybrid method combining a grey relational analysis, Lasso regression, and Backpropagation Neural Network (GAR-LASSO-BPNN). This method has significant advantages in handling high-dimensional small samples and multiple correlated variables. The grey relational analysis (GRA) is used to quantitatively identify cost-driving factors, and 14 highly correlated factors are selected as input variables. Then, regularization through Lasso regression (LASSO) is used to filter the final input variables, which are subsequently input into the Backpropagation Neural Network (BPNN) to establish the relationship between the unit cost of residential projects and 12 input variables. Compared to using LASSO and BPNN methods individually, the GAR-LASSO-BPNN hybrid prediction method performs better in terms of error evaluation metrics. The research findings can provide quantitative decision support for cost estimators in the early estimation stages of residential project investment decision-making.
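The GRA, Lasso, and BPNN stages of the hybrid above can be sketched as follows: grey relational grades screen candidate cost drivers, Lasso prunes them further, and an MLP maps the survivors to unit cost. The grade formula follows the standard grey relational analysis definition; the data, thresholds, and layer sizes are invented.

```python
# Sketch: GRA -> Lasso -> BPNN-like MLP, mirroring the pipeline's stages.
# All data and cut-offs are hypothetical.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(3)
X = rng.random((47, 17))                       # 17 candidate cost drivers
y = X[:, :5] @ np.arange(1, 6) + rng.normal(0, 0.2, 47)

def grey_relational_grade(X, y, rho=0.5):
    """Grade of each column of X against the reference series y (scaled)."""
    Xs = MinMaxScaler().fit_transform(X)
    ys = MinMaxScaler().fit_transform(y.reshape(-1, 1)).ravel()
    delta = np.abs(Xs - ys[:, None])
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coeff.mean(axis=0)

grades = grey_relational_grade(X, y)
keep = np.argsort(grades)[-14:]                # keep 14 highest-grade drivers

lasso = LassoCV(cv=5, random_state=0).fit(X[:, keep], y)
final = keep[np.abs(lasso.coef_) > 1e-6]       # Lasso-selected subset
print("selected driver indices:", sorted(final))

bpnn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X[:, final], y)
print("train R^2:", round(bpnn.score(X[:, final], y), 3))
```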
... Rafiei and Adeli [11] introduced a new model for estimating the construction cost based on Economic Variables and Indexes (EV&Is) and machine learning. It incorporated physical and financial real estate unit and EV&Is variables, and was inspected using 372 building constructing cost data, resulting in significantly lower estimation errors. ...
Article
Full-text available
In this study, an innovative method has been proposed for resource allocation among contractors in large construction projects. This method is designed based on a combination of machine learning techniques, fuzzy theory, and auction modeling. Resource allocation in the context of large construction projects, where multiple contractors work simultaneously, can pose a complex problem. Developing an efficient method to address this issue can contribute to improving project performance in terms of cost and construction delays. We have presented a three-stage method for resource allocation in large construction projects. In the first stage, machine learning techniques are utilized to develop two distinct neural network models for predicting costs and delays for each contractor. These models utilize the Genetic Algorithm (GA) to optimize their parameters. In the second stage, a fuzzy model is used, which takes inputs from the neural network models and other contractor-specific features. This model prioritizes the needs of the contractors. Finally, an auction model is employed to fairly distribute the limited project resources between the contractors in need. The implementation results indicate that the proposed method for allocating resources among contractors in large construction projects have achieved MAE and RMSE values of 18.88 and 22.98, respectively, demonstrating a significant performance improvement compared to other proposed methods.
... The series of studies by Rafiei and Adeli from 2016 to 2018 exemplify diverse applications of deep learning in this domain. These applications include the prediction of new housing prices (Rafiei & Adeli, 2016), estimating concrete properties (Rafiei et al., 2017), improving earthquake early warning systems (Rafiei & Adeli, 2017b), and estimating construction costs (Rafiei & Adeli, 2018). These studies not only emphasize the accuracy and efficiency of deep learning models but also pave the way for future research integrating state-of-the-art machine learning concepts into civil engineering. ...
Article
Full-text available
This paper introduces an enhanced you only look once (YOLO) v5s‐D network customized for detecting various categories of damage to post‐fire reinforced concrete (RC) components. These damage types encompass surface soot, cracks, concrete spalling, and rebar exposure. A dataset containing 1536 images depicting damaged RC components was compiled. By integrating ShuffleNet, adaptive attention mechanisms, and a feature enhancement module, the capability of the network for multi‐scale feature extraction in complex backgrounds was improved, alongside a reduction in model parameters. Consequently, YOLOv5s‐D achieved a detection accuracy of 93%, marking an 11% enhancement over the baseline YOLOv5s network. Comparison and ablation tests conducted on different modules, varying dataset sizes, against other state‐of‐the‐art networks, and on public datasets validate the resilience, superiority, and generalization capability of YOLOv5s‐D. Finally, an application leveraging YOLOv5s‐D was developed and integrated into a mobile device to facilitate real‐time detection of post‐fire damaged RC components. This application can integrate diverse fire scenarios and data types, expanding its scope in the future. The proposed detection method compensates for the subjective limitations of manual inspections, providing a reference for damage assessment.
... In this context, computer vision [14,15] together with machine learning techniques of artificial intelligence [16] can be a useful mechanism for the efficient collection of images of roads and their subsequent processing for classification into the different types of defects previously detailed. In fact, these disciplines have already proven useful in other fields of civil engineering for the monitoring and inspection of structures [17], allowing the automation of tasks that used to require great effort and execution time. ...
... Moreover, researchers have been investigating ways to utilize artificial intelligence (AI) for data analysis, decision-making and process optimization. Some of the use cases proposed by the researchers are construction cost estimation (Rafiei and Adeli 2018), building energy consumption prediction (Singaravel et al. 2018), worker activity recognition (Akhavian and Behzadan 2016) and recognition of construction materials' conditions . Large language models (LLMs) (e.g. ...
Article
Full-text available
ChatGPT, a large language model chatbot by OpenAI, has increasingly become a part of employees' day-today activities in numerous industries, including construction, and researchers have looked into this tool since its first release in late 2022 to assist in different fields. One of the benefits of such tools can be related to improved efficiency; however, it raises data privacy and security concerns. Considering the increasing reliance on information technology and operational technology for enhanced productivity, accuracy and quality in projects, these concerns also affect the construction sector. This study presents an overview of the existing literature on the applications of ChatGPT in the construction sector, highlighting its potential to revolutionize various resource-intensive tasks in projects and the related cybersecurity risks. VOSviewer is used for bibliometric analysis of academic publications and to identify the relevant cybersecurity problems. The identified issues are categorized into three main groups and discussed in the context of construction applications. Suggestions are provided to address identified concerns. This paper highlights the importance of ensuring the secure deployment of ChatGPT in the construction sector, a subject that has not been explored in the existing literature.
... Thus, when the information has a significant number of features, the precision of the prediction model and its execution time might be adversely affected, a problem known as the curse of dimensionality (Gegic et al., 2019). However, many of these features are incomplete, inappropriate, or irrelevant to the sale price (Myers, 2016; Rafiei and Adeli, 2018). ...
Article
Full-text available
Price prediction algorithms propose prices for every product or service according to market trends, projected demand, and other characteristics, including government rules, international transactions, and speculation and expectation. The price, as the dependent variable, is affected by several independent and correlated variables, which may challenge the prediction. To overcome this challenge, machine learning algorithms allow more accurate price prediction without explicitly modeling the relatedness between variables. However, as inputs increase, existing machine learning approaches are challenged regarding computing efficiency and prediction effectiveness. Hence, this study introduces a novel decision-level fusion approach to select informative variables in price prediction. The suggested metaheuristic algorithm balances two competitive objective functions, which are defined to improve the utilized prediction variables and reduce the error rate simultaneously. To generate Pareto optimal solutions, an Elastic net approach is employed to eliminate unrelated and redundant variables and increase accuracy. Afterward, we propose a novel method for combining solutions and ensuring that a subset of features is optimal. Two real datasets are used to evaluate the proposed price prediction method. The results support the suggested superiority of the model concerning its relative root mean square error and adjusted correlation coefficient.
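The elastic-net screening step named in the abstract can be illustrated in a few lines; the multi-objective metaheuristic and decision-level fusion stages are omitted here, and the data are synthetic.

```python
# Sketch: elastic-net screening drops unrelated/redundant variables
# before price prediction. Synthetic data, illustrative settings only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNetCV

X, y = make_regression(n_samples=300, n_features=40, n_informative=8,
                       noise=5, random_state=0)
enet = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5, random_state=0).fit(X, y)
selected = np.flatnonzero(np.abs(enet.coef_) > 1e-6)
print(f"kept {selected.size}/40 variables:", selected)
```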
Article
Roof inspections are crucial but perilous, necessitating safer and more cost‐effective solutions. While robots offer promising solutions to reduce fall risks, robotic vision systems face efficiency limitations due to computational constraints and scarce specialized data. This study presents real‐time roof defect segmentation network (RRD‐SegNet), a deep learning framework optimized for mobile robotic platforms. The architecture features a mobile‐efficient backbone network for lightweight processing, a defect‐specific feature extraction module for improved accuracy, and a regressive detection and classification head for precise defect localization. Trained on the multi‐type roof defect segmentation dataset of 1350 annotated images across six defect categories, RRD‐SegNet integrates with a roof damage identification module for real‐time tracking. The system surpasses state‐of‐the‐art models with 85.2% precision and 76.8% recall while requiring minimal computational resources. Field testing confirms its effectiveness with F1‐scores of 0.720–0.945 across defect types at processing speeds of 1.62 ms/frame. This work advances automated inspection in civil engineering by enabling efficient, safe, and accurate roof assessments via mobile robotic platforms.
Article
Purpose Since the Chinese real estate market has expanded so quickly over the past 10 years, investors and the government are both quite concerned about projecting future property prices. Design/methodology/approach This work aims to investigate monthly rental price index forecasts of residential properties for ten major Chinese cities from 3M2012 to 5M2020 by using Gaussian process regressions with a diverse variety of kernels and basis functions. The authors conduct forecast exercises through use of Bayesian optimizations and cross-validation. Findings With relative root mean square errors spanning the range of 0.0370%–0.8953%, the constructed models successfully forecast the ten price indices from 6M2019 to 5M2020 out of sample. Originality/value The findings might be used independently or in combination with other projections to create theories about the trends in the rental price index of the residential property and carry out additional policy analysis.
Article
Short‐term prediction of track degradation facilitates flexible and efficient maintenance, thereby meeting the railway system's escalating demands for track safety and smoothness. However, the track condition evolution presents challenges to accurate prediction, with diverse influential factors resulting in heterogeneous degradation patterns across space and time. In a short‐term context, time series derived from historical records are length‐limited, with sparse sampling points complicating feature identification. Actual activities, particularly minor repairs, lack strict periodicity, leading to irregular spans in continuous degradation curves, yielding nonuniform samples. This study leverages dynamic inspection and influential factors to propose an ensemble learning using the Transformer model. The outer framework employs unsupervised learning to group the sections based on specific time periods and track lengths. It assigns fuzzy logic categories to these groups to capture differentiated patterns and guides the division of samples into fuzzy subsets and assigns them to the learners corresponding to each cluster. The loosely coupled structure aids task decomposition and enhances local performance. The inner model refines the Transformer design for a new scenario, introducing a prediction objective transformation based on the interdependencies among multidimensional indicators to strengthen feature extraction. The prediction performance is evaluated using over 2 years of records from 560 km railway lines, offering insights for improving onsite track management.
Article
Accurately estimating traffic volumes in construction work zones is crucial for effective traffic management. However, one of the key challenges transportation agencies face is the limited coverage of continuous count station (CCS) sensors, which are often sparsely located and may not be positioned directly on roads where construction work zones are present. This spatial limitation leads to gaps in traffic data, making accurate volume estimation difficult. Addressing this, our study utilized a custom regularized model and variational autoencoders (VAE) to generate synthetic data that improves traffic volume estimations in these challenging areas. The proposed method not only bridges the data gaps between sparse CCS sensors but also outperforms several benchmark models, as measured by mean absolute percentage error, root mean square error, and mean absolute error. Moreover, the effectiveness of VAE‐augmented models in enhancing the precision and accuracy of traffic volume estimations further underscores the benefits of integrating synthetic data into traffic‐modeling approaches. These findings highlight the potential of the proposed approach to enhance traffic volume estimation in construction work zones and assist transportation agencies in making informed decisions for traffic management.
Article
Full-text available
The Chinese real estate market expanded at a rapid rate over the last two decades, up to the current decline patterns that began at the end of 2021. As a result, predicting future property prices has become a significant challenge for both the government and investors. Within the scope of this investigation, we investigate quarterly national residential property price indices for China, with data sourced from the Bank for International Settlements from the second quarter of 2005 to the first quarter of 2024, by using Gaussian process regressions with a variety of kernels and basis functions. For model training and for conducting forecasting exercises with the estimated models, we make use of cross-validation and Bayesian optimisations based upon the expected-improvement-per-second-plus algorithm. Use of Bayesian optimisations endows Gaussian process regression models with good flexibility for forecasting into the future. With a relative root mean square error of 0.1291 percent, root mean square error of 0.1816, mean absolute error of 0.1527, and correlation coefficient of 99.901%, the created models were able to reliably anticipate the price indices from the third quarter of 2020 to the first quarter of 2024 out of sample. The constructed Gaussian process regression models also outperform several alternative machine learning models and econometric models, and their forecast performance is robust to different out-of-sample evaluation periods. Our findings might be used either alone or in combination with other projections to build hypotheses about trends in the residential real estate price index and to carry out further policy research.
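The Gaussian-process-regression forecasting setup recurring in these abstracts can be sketched by framing the index series as a lagged regression, comparing kernels by cross-validation (a simple stand-in for the Bayesian optimisation the authors use), and evaluating out of sample. The simulated series and kernel shortlist are illustrative only.

```python
# Sketch: kernel selection by CV for GPR forecasting of a simulated index.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (Matern, RBF, RationalQuadratic,
                                              WhiteKernel)
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
t = np.arange(80.0)
index = 100 + 0.5 * t + 3 * np.sin(t / 6) + rng.normal(0, 0.5, t.size)  # toy index

# Supervised framing: predict the next value from the previous 4 observations.
lags = 4
X = np.column_stack([index[i:i - lags] for i in range(lags)])
y = index[lags:]
X_tr, y_tr, X_te, y_te = X[:-12], y[:-12], X[-12:], y[-12:]

best_kernel, best_score = None, -np.inf
for k in [RBF(), Matern(nu=1.5), RationalQuadratic()]:
    gpr = GaussianProcessRegressor(kernel=k + WhiteKernel(),
                                   normalize_y=True, random_state=0)
    s = cross_val_score(gpr, X_tr, y_tr, cv=3,
                        scoring="neg_root_mean_squared_error").mean()
    if s > best_score:
        best_kernel, best_score = k, s

# Refit the best kernel and check the 12-step out-of-sample fit.
gpr = GaussianProcessRegressor(kernel=best_kernel + WhiteKernel(),
                               normalize_y=True, random_state=0).fit(X_tr, y_tr)
rmse = np.sqrt(np.mean((gpr.predict(X_te) - y_te) ** 2))
print("best kernel:", best_kernel, "| out-of-sample RMSE:", round(rmse, 3))
```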
Article
Precise three‐dimensional (3D) instance segmentation of indoor scenes plays a critical role in civil engineering, including reverse engineering, size detection, and advanced structural analysis. However, existing methods often fall short in accurately segmenting complex indoor environments due to the challenges of diverse material textures, irregular object shapes, and inadequate datasets. To address these limitations, this paper introduces StructNet3D, a point cloud neural network specifically designed for instance segmentation of indoor components including ceilings, floors, and walls. StructNet3D employs a novel multi‐scale 3D U‐Net backbone integrated with ArchExtract, which is designed to capture both global context and local structural details, enabling precise segmentation of diverse indoor environments. Compared to other methods, StructNet3D achieved an AP50 of 87.7 on the proprietary dataset and 68.6 on the S3DIS dataset, demonstrating its effectiveness in accurately segmenting and classifying major structural components within diverse indoor environments.
Article
Particle morphology is a crucial factor influencing the mechanical properties of granular materials, particularly in infrastructure construction processes where accurate shape descriptors are essential. Accurately measuring three‐dimensional (3D) morphology has significant theoretical and practical value for exploring the multiscale mechanical properties of civil engineering materials. This study proposes a novel approach using multiview (two‐dimensional [2D]) particle images to efficiently predict 3D morphology, making real‐time aggregate quality analysis feasible. A 3D convolutional neural network (CNN) model is developed, which combines Monte Carlo dropout and attention mechanisms to achieve uncertainty‐evaluated predictions of 3D morphology. The model incorporates a convolutional block attention module, involving a two‐stage attention mechanism with channel attention and spatial attention, to further optimize feature representation and enhance the effectiveness of the attention mechanism. A new dataset comprising 18,000 images of 300 natural gravel and 300 blasted rock fragment particles is used for model training. The prediction accuracy and uncertainty of the proposed model are benchmarked against a range of alternative models, including 2D CNN, 3D CNN, and 2D CNN with attention; in particular, the influence of the number of input multiview particle images on the models' performance in predicting various morphological parameters is explored. The results indicate that the proposed 3D CNN model with the attention mechanism achieves high prediction accuracy with an error of less than 10%. Whilst it initially exhibits greater uncertainty than other models due to its increased complexity, the model shows significant improvement in both accuracy and uncertainty as the number of training images is increased. Finally, residual challenges associated with the prediction of more complex particle angles and irregular shapes are also discussed.
Article
Full-text available
Soil classification and analysis are essential for understanding soil properties and serve as a foundation for various engineering projects. Traditional methods of soil classification rely heavily on costly and time-consuming laboratory and in-situ tests. In this study, Support Vector Machine (SVM) models were trained for soil classification using 649 Cone Penetration Test (CPT) datasets, specifically utilizing cone tip resistance (q_c) and sleeve friction (f_s) as input variables. Pearson correlation and sensitivity analysis confirmed that these variables are highly correlated with the classification results. To enhance classification performance, 25 optimization algorithms were applied, and the models were validated against an independent dataset of 208 CPT records. The results revealed that 23 of the algorithms successfully improved the SVM classification accuracy. Among these, 18 algorithms achieved higher accuracy than the current engineering standard, the “Code for in-situ Measurement of Railway Engineering Geology.” Notably, the Thermal Exchange Optimization (TEO) algorithm resulted in the most significant improvement, increasing the accuracy of the original SVM model by 10% and exceeding the standard by 4.3%. Moreover, the models were thoroughly evaluated using Monte Carlo simulations, confusion matrices, ROC curves, and 10 key performance metrics. In conclusion, integrating evolutionary algorithms with SVM for soil classification offers a promising approach to enhancing the efficiency and accuracy of soil analysis in engineering applications.
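The tuned-SVM classification idea can be sketched with an RBF SVM over the two CPT features, q_c and f_s, with C and gamma tuned by randomized search as a simple stand-in for the 25 metaheuristics compared above. The synthetic data and class boundaries are invented.

```python
# Sketch: RBF SVM on (q_c, f_s) with randomized hyperparameter search
# standing in for metaheuristic tuning. Data are synthetic.
import numpy as np
from scipy.stats import loguniform
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
n = 649
q_c = rng.lognormal(1.5, 0.6, n)                    # cone tip resistance (MPa)
f_s = 0.02 * q_c * rng.lognormal(0, 0.3, n)         # sleeve friction (MPa)
soil = np.digitize(f_s / q_c, bins=[0.015, 0.025])  # 3 toy soil classes

X = np.column_stack([q_c, f_s])
X_tr, X_te, y_tr, y_te = train_test_split(X, soil, random_state=0)

search = RandomizedSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    {"svc__C": loguniform(0.1, 100), "svc__gamma": loguniform(1e-3, 10)},
    n_iter=20, cv=5, random_state=0).fit(X_tr, y_tr)
print("best params:", search.best_params_,
      "| test accuracy:", search.score(X_te, y_te))
```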
Article
Due to the rapid growth of the Chinese housing market over the past ten years, forecasting home prices has become a crucial issue for investors and authorities alike. In this research, utilising Bayesian optimisations and cross validation, we investigate Gaussian process regressions across various kernels and basis functions for monthly residential real estate price index projections for ten significant Chinese cities from July 2005 to April 2021. The developed models provide accurate out-of-sample forecasts for the ten price indices from May 2019 to April 2021, with relative root mean square errors varying from 0.0207% to 0.2818%. Our findings could be used individually or in combination with other projections to formulate theories about the trends in the residential real estate price index and carry out additional policy analysis.
Article
Rock quality designation (RQD) plays a crucial role in the design and analysis of rock engineering. The traditional method of measuring RQD relies on manual logging by geologists, which is often labor‐intensive and time‐consuming. Thus, this study presents an autonomous framework for expeditious RQD estimation based on two‐dimensional corebox photographs. The scale‐invariant feature transform (SIFT) algorithm is employed for rapid image calibration. A K‐Net‐based model with dynamic semantic kernels, conditional on their actual activations, is proposed for rock core segmentation. It surpasses other prevalent models with a mean intersection over union of 95.43%. The automatic RQD estimation error of our proposed framework is only 1.46% compared to manual logging results, demonstrating its exceptional reliability and effectiveness. The robustness of the framework is then validated on an additional test set, proving its potential for widespread adoption in geotechnical engineering practice.
Article
The reliance of contractor selection for specific construction activities on subjective judgments remains a complex decision‐making process with high stakes due to its impact on project success. Existing methods of contractor selection lack a data‐driven decision‐support approach, leading to suboptimal contractor assignments. Here, an advanced node2vec‐based recommendation system is proposed that addresses the shortcomings of conventional contractor selection by incorporating a broad range of quantitative performance indicators. This study utilizes semi‐supervised machine learning to analyze contractor records, creating a network in which nodes represent activities and weighted edges correspond to contractors and their performances, particularly cost and schedule performance indicators. Node2vec is found to display a prediction accuracy of 88.16% and 84.08% when processing cost and schedule performance rating networks, respectively. The novelty of this research lies in its proposed network‐based, multi‐criteria decision‐making method for ranking construction contractors using embedding information obtained from quantitative contractor performance data and processed by the node2vec procedure, along with the measurement of cosine similarity between contractors and the ideal as related to a given activity.
Article
Construction projects require significant funding and are exposed to several risks. Public construction projects consume a major proportion of the annual government budget. Their accurate cost estimation is a known and persistent problem for the construction sector, and project failures in the form of budget overruns can be documented around the world. Accurate construction cost predictions are essential in mitigating time-related risks and play a crucial role in the decision-making process for managers. Inaccurate cost estimations can result in investment project disruptions. Research on machine learning (ML) techniques for construction cost estimation is intensifying, aiming to develop new ML techniques or update existing ones. This article contains a systematic literature review of ML techniques for construction project cost estimation. This review included an in-depth analysis of 219 studies covering the most prominent machine learning techniques. The article defines a classification of the identified ML techniques using the following criteria: the intelligent technique that was followed and the application domain. The resulting taxonomy covers ML techniques for construction cost estimation and their applications, offering useful guidance for both researchers and practitioners.
Article
Full-text available
Purpose Unlocking the potential of Big Data Analytics (BDA) has proven to be a transformative factor for the Architecture, Engineering and Construction (AEC) industry. This has prompted researchers to focus attention on BDA in the AEC industry (BDA-in-AECI) in recent years, leading to a proliferation of relevant research. However, an in-depth exploration of the literature on BDA-in-AECI remains scarce. As a result, this study seeks to systematically explore the state-of-the-art review on BDA-in-AECI and identify research trends and gaps in knowledge to guide future research. Design/methodology/approach This state-of-the-art review was conducted using a mixed-method systematic review. Relevant publications were retrieved from Scopus and then subjected to inclusion and exclusion criteria. A quantitative bibliometric analysis was conducted using VOSviewer software and Gephi to reveal the status quo of research in the domain. A further qualitative analysis was performed on carefully screened articles. Based on this mixed-method systematic review, knowledge gaps were identified and future research agendas of BDA-in-AECI were proposed. Findings The results show that BDA has been adopted to support AEC decision-making, safety and risk assessment, structural health monitoring, damage detection, waste management, project management and facilities management. BDA also plays a major role in achieving construction 4.0 and Industry 4.0. The study further revealed that data mining, cloud computing, predictive analytics, machine learning and artificial intelligence methods, such as deep learning, natural language processing and computer vision, are the key methods used for BDA-in-AECI. Moreover, several data acquisition platforms and technologies were identified, including building information modeling, Internet of Things (IoT), social networking and blockchain. Further studies are needed to examine the synergies between BDA and AI, BDA and Digital twin and BDA and blockchain in the AEC industry. Originality/value The study contributes to the BDA-in-AECI body of knowledge by providing a comprehensive scope of understanding and revealing areas for future research directions beneficial to the stakeholders in the AEC industry.
Article
In the fields of engineering seismology and earthquake engineering, researchers have predominantly focused on ground motion models (GMMs) for intensity measures. However, there has been limited research on power spectral density GMMs (PSD-GMMs) that characterize spectral characteristics. PSD, being structure-independent, offers unique advantages. This study aims to construct PSD-GMMs using non-parametric machine learning (ML) techniques. By considering 241 different frequencies from 0.1 to 25.12 Hz and evaluating eight performance indicators, seven highly accurate and stable ML techniques are selected from 12 candidate ML techniques as foundational models for the PSD-GMM. Through mixed effects regression analysis, inter-event, intra-event, and inter-site standard deviations are derived. To address inherent modeling uncertainty, this study weights each foundational model by the ratio of the reciprocal of its total-residual standard deviation to the sum of these reciprocals across the seven ML GMMs, thereby constructing a hybrid non-parametric PSD-GMM. Utilizing this model, ground motion records can be simulated, and seismic hazard curves and uniform hazard PSD can be obtained. In summary, the hybrid non-parametric PSD-GMM demonstrates remarkable efficacy in simulating and predicting ground motion records and holds significant potential for guiding seismic hazard and risk analysis.
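Stated as a formula, the weighting above is the familiar inverse-deviation scheme; a plausible reading of the abstract (with sigma_i the total-residual standard deviation of the i-th foundational model and PSD_i its prediction) is:

```latex
w_i = \frac{1/\sigma_i}{\sum_{j=1}^{7} 1/\sigma_j},
\qquad
\widehat{\mathrm{PSD}}(f) = \sum_{i=1}^{7} w_i \, \widehat{\mathrm{PSD}}_i(f),
\qquad
\sum_{i=1}^{7} w_i = 1 .
```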
Article
A reverse calculation method termed soil and lining physics-informed neural network (SL-PINN) is proposed for the estimation of load for tunnel lining in elastic soil based on radial displacement measurements of the tunnel lining. To achieve efficient and accurate calculations, the framework of SL-PINN is specially designed to consider the respective displacement characteristics of the surrounding soil and the tunnel lining. A multistep training method based on the meshless characteristics of SL-PINN is established to improve calculation efficiency. The multistep training method involves increasing the number of collocation points in each calculation step while decreasing the learning rate after scaling of SL-PINN. The feasibility of SL-PINN is verified by numerical simulation data and field data. Compared to other inverse calculation methods, SL-PINN requires lower measurement precision from the instrument for the same level of calculation accuracy.
Article
Full-text available
Over the past decade, there has been a dramatic increase in the use of various technologies in the Architecture, Engineering, and Construction sector. Artificial intelligence has played a significant role throughout the different phases of the design and construction process. A growing body of literature recognizes the importance of artificial neural network applications in numerous areas of the construction industry and the built environment, presenting a need to explore the main research themes, attributes, benefits, and challenges. A three-step extensive research method was utilized by conducting a bibliometric search of English language articles and conducting quantitative and qualitative analyses. The bibliometric analysis aimed to identify the current research directions and gaps forming future research areas. The scientometric analysis of keywords revealed diverse areas within the construction industry linked to ANNs. The qualitative analysis of the selected literature revealed that energy management in buildings and construction cost predictions were the leading research topics in the study area. These findings recommend directions for further research in the field, for example, broadening the application ranges of ANNs in the current Construction 4.0 technologies, such as robotics, 3D printing, digital twins, and VR applications.
Article
Full-text available
In the construction industry, traditional methods of cost estimation are inefficient and cannot reflect real-time changes. Modern techniques are essential to create new tools that outperform current cost estimation. This study introduced the Least Square Moment Balanced Machine (LSMBM), an AI-based inference engine, to improve construction cost prediction accuracy. LSMBM considers moments to determine the optimal hyperplane and uses the Backpropagation Neural Network (BPNN) to assign weights to each data point. The effectiveness of LSMBM was tested by predicting the construction costs of residential and reinforced concrete buildings. Correlation analysis, PCA, and LASSO were used for feature selection to identify the most relevant variables, with the combination of LSMBM-PCA giving the best performance. When compared to other machine learning models, the LSMBM model achieved the lowest error values, with an RMSE of 0.016, MAE of 0.010, and MAPE of 4.569%. The overall performance measurement reference index (RI) further confirmed the superiority of LSMBM. Furthermore, LSMBM performed better than the Earned Value Management (EVM) method. The LSMBM model has proven to enhance the precision of cost estimates, helping project managers anticipate potential cost overruns, optimize resource allocation, and inform strategic and operational decision-making in construction projects.
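The feature-screening stage described here (correlation filtering, PCA, LASSO) can be sketched independently of the LSMBM engine itself, which is not public; the cost dataset below is a random placeholder.

```python
# Sketch: the feature-selection step described in the abstract --
# correlation screening, PCA, and LASSO -- applied before a cost
# regressor is fit. The data are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 12))        # candidate cost drivers
y = X[:, :3] @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.1, 120)

corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(12)])
keep = corr > 0.1                     # correlation screening
X_pca = PCA(n_components=3).fit_transform(X[:, keep])   # PCA variant
lasso = LassoCV(cv=5).fit(X, y)       # LASSO alternative
print(keep.sum(), X_pca.shape, np.flatnonzero(lasso.coef_ != 0))
```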
Article
The phenomenon of overweight vehicles severely threatens traffic safety and the service life of transportation infrastructure. Rapid and effective identification of overweight vehicles is of significant importance for maintaining the healthy operation of highways and bridges and ensuring the safety of people's lives and property. Owing to their high cost and low efficiency, traditional vehicle weighing systems can only meet some of the requirements of different scenarios. The development of artificial intelligence technologies, especially deep learning, has greatly enhanced the accuracy and efficiency of computer vision. To this end, the paper proposes a method using computer vision and deep learning for the non-contact identification of overweight vehicles. By constructing two deep learning models and combining them with the vehicle vibration model and relevant specifications, the weight and maximum allowable weight of the vehicle are obtained and compared to determine overweight status. Experimental verification was performed using a two-axle vehicle as an illustrative example, and the results demonstrate that the proposed method exhibits excellent feasibility and effectiveness. It shows significant potential in real-world scenarios, laying a research foundation for practical engineering applications. Additionally, it provides a reference for the governance of overweight issues and decision-making by the relevant authorities.
Article
This study proposes an innovative method for achieving autonomous flight to inspect overhead transmission facilities. The proposed method not only integrates multimodal information from novel sensors but also addresses three essential aspects to overcome the existing limitations in autonomous flights of an unmanned aerial vehicle (UAV). First, a novel deep neural network architecture titled the rotational bounding box with a multi‐level feature pyramid transformer is introduced for accurate object detection. Second, a safe autonomous method for the transmission tower approach is proposed by using multimodal information from an optical camera and 3D light detection and ranging. Third, a simple yet accurate control strategy is proposed for tracking transmission lines without necessitating gimbal control because it keeps the UAV's altitude in sync with that of the transmission lines. Systematic analyses conducted in both virtual and real‐world environments confirm the effectiveness of the proposed method. The proposed method not only enhances the performance of autonomous flight but also provides a safe operating platform for inspection personnel.
Article
Full-text available
This study investigated the awareness of Nigerian construction organisations of some identified ML application areas, and the readiness of the organisations to adopt ML in those areas. A comprehensive literature review was undertaken to identify the application areas of ML; then, a well-structured questionnaire was developed and used to gather relevant data from construction professionals using the snowball sampling method via electronic means. 143 valid responses were obtained, and the gathered data were analysed using arrays of descriptive and inferential analytical tools. The study revealed that the critical application areas of ML with higher awareness levels and adoption readiness in Nigeria are (1) health and safety prediction and management, (2) waste management, (3) prediction and management of construction costs, (4) risk management, (5) structural health monitoring and prediction, and (6) building life-cycle assessment and management. Further, a significant statistical difference was observed between the opinions of the participants regarding the awareness and adoption readiness of the various ML application areas. This study identified critical application areas of ML where awareness and adoption readiness are very high, thus signalling the preparedness of the Nigerian construction industry (NCI) to embrace ML to drive sustainable construction.
Article
Full-text available
Efficient representation of complex infrastructure systems is crucial for system-level management tasks, such as edge prediction, component classification, and decision-making. However, the complex interactions between infrastructure systems and their spatial environments increase the complexity of network representation learning. This study introduces a novel geometric-based multimodal deep learning model for spatially embedded network representation learning, namely the regional spatial graph convolutional network (RSGCN). The developed RSGCN model simultaneously learns from each node's multimodal spatial features. To evaluate network representation performance, the introduced RSGCN model is used to embed different infrastructure networks into latent spaces and then reconstruct the networks. A synthetic network dataset, a California Highway Network, and a New Jersey Power Network were used as testbeds. The performance of the developed model is compared with two other state-of-the-art geometric deep learning models, GraphSAGE and Spatial Graph Convolutional Network. The results demonstrate the importance of considering regional information and the effectiveness of using novel graph convolutional neural networks for a more accurate representation of complex infrastructure systems.
Article
Purpose The purpose of this study is to make property price forecasts for the Chinese housing market that has grown rapidly in the last 10 years, which is an important concern for both government and investors. Design/methodology/approach This study examines Gaussian process regressions with different kernels and basis functions for monthly pre-owned housing price index estimates for ten major Chinese cities from March 2012 to May 2020. The authors do this by using Bayesian optimizations and cross-validation. Findings The ten price indices from June 2019 to May 2020 are accurately predicted out-of-sample by the established models, which have relative root mean square errors ranging from 0.0458% to 0.3035% and correlation coefficients ranging from 93.9160% to 99.9653%. Originality/value The results might be applied separately or in conjunction with other forecasts to develop hypotheses regarding the patterns in the pre-owned residential real estate price index and conduct further policy research.
Article
The occurrence of pavement cracks poses a significant potential threat to road safety, so the rapid and accurate acquisition of pavement crack information is of paramount importance. Deep learning methods have the capability to offer precise and automated crack detection solutions based on crack images. However, slow detection speed and large model size in high-accuracy models remain the main challenges to be addressed. Therefore, this research presents a lightweight feature attention fusion network for pavement crack segmentation. This structure employs FasterNet as the backbone network, ensuring performance while reducing model inference time and memory overhead. Additionally, the receptive field block is incorporated to simulate human visual perception, enhancing the network's feature extraction capability. Finally, our approach employs the feature fusion module (FFM) to effectively combine decoder outputs with the encoder's low-level features using weight vectors. Experimental results on public crack datasets, namely CFD, CRACK500, and DeepCrack, demonstrate that, compared to other semantic segmentation algorithms, the proposed method achieves both accurate and comprehensive pavement crack extraction while ensuring speed.
Article
Network-wide short-term passenger flow prediction is critical for the operation and management of metro systems. However, it is challenging due to the inherent non-stationarity, nonlinearity, and spatial–temporal dependencies within passenger flow. To tackle these challenges, this paper introduces a hybrid model called multi-scale dynamic propagation spatial–temporal network (MSDPSTN). Specifically, the model employs multivariate empirical mode decomposition to jointly decompose the multivariate passenger flow into multi-scale intrinsic mode functions. Then, a set of dynamic graphs is developed to reveal the passenger propagation law in metro networks. Based on the representation, a deep learning model is proposed to achieve multistep passenger flow prediction, which employs the dynamic propagation graph attention network with long short-term memory to extract the spatial–temporal dependencies. Extensive experiments conducted on a real-world dataset from Chengdu, China, validate the superiority of the proposed model. Compared to state-of-the-art baselines, MSDPSTN reduces the mean absolute error, root mean squared error, and mean absolute percentage error by at least 3.243%, 4.451%, and 4.139%, respectively. Further quantitative analyses confirm the effectiveness of the components in MSDPSTN. This paper contributes to addressing inherent features of passenger flow to enhance prediction performance, offering critical insights for decision-makers in implementing real-time operational strategies.
Article
Maintaining airport runways is crucial for safety and efficiency, yet traditional monitoring relies on manual inspections, which are time-consuming and prone to inaccuracy. This study pioneers the utilization of low-cost dashcam imagery for the detection and geolocation of airport runway pavement distresses, employing novel deep-learning frameworks. A significant contribution of our work is the creation of the first public dataset specifically designed for this purpose, addressing a critical gap in the field. This dataset, enriched with diverse distress types under various environmental conditions, enables the development of an automated, cost-effective method that substantially enhances airport maintenance operations. Leveraging low-cost dashcam technology in this unique scenario, our approach demonstrates remarkable potential in improving the efficiency and safety of airport runway inspections, offering a scalable solution for infrastructure management. Our findings underscore the benefits of integrating advanced imaging and artificial intelligence technologies, paving the way for advancements in airport maintenance practices.
Conference Paper
Construction projects are prone to experience significant delays and cost overruns due to uncontrollable risks raised by their complex, unique, and uncertain nature. Conventional Risk Management methods have proven inefficient, time-consuming, and highly subjective, making exploring innovative and data-driven solutions essential. Artificial Intelligence (AI) is revolutionizing the construction industry by offering improved, optimized, and automatized Project Management solutions, which can benefit existing RM processes significantly. This study investigates the application of various Machine Learning algorithms for delay and cost overrun risk prediction in construction projects. A case study involving NYC school construction projects is used to train and evaluate algorithms such as Decision Trees, Artificial Neural Networks, Extreme Gradient Boosting, and Linear and Ridge regressions. The ultimate goal of this research is to conduct a comparative analysis between the performances and prediction precision of different ML algorithms for delays and cost overruns, two of the most significant construction risks, concerning each algorithm’s structure and learning process. The results of this study provide automated and precise predictions of risks in new construction projects while also contributing valuable insights into the potential and benefits of ML applications in the construction industry.
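The comparative set-up is straightforward to reproduce in outline: the same features, the same cross-validation protocol, several regressors. In the sketch below, sklearn's GradientBoostingRegressor stands in for Extreme Gradient Boosting, and the project data are synthetic.

```python
# Sketch: the comparative study described here -- several regressors
# evaluated on the same project features with a common CV protocol.
# Feature matrix X and the overrun target y are placeholders.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 8))                            # project descriptors
y = X @ rng.normal(size=8) + rng.normal(0, 0.5, 300)     # cost-overrun proxy

models = {
    "tree": DecisionTreeRegressor(max_depth=5),
    "ann": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000),
    "boosting": GradientBoostingRegressor(),              # XGBoost stand-in
    "linear": LinearRegression(),
    "ridge": Ridge(alpha=1.0),
}
for name, model in models.items():
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name}: RMSE = {rmse:.3f}")
```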
Article
Full-text available
Masonry construction is labor-intensive. Processes require a large number of crews made up of masons with diverse skills, capabilities, and personalities. Crews are often reassembled, and the superintendent on site is responsible for allocating crews to balance the complexity of the job against the need for quality and high production rates. However, the masonry industry still faces schedule growth and low productivity rates that result from inefficiencies in crew allocation. This article presents a system for efficient crew allocation in the masonry industry, formulated as a mixed-integer program. The system takes into consideration the characteristics of masons and site conditions, and how to relate these to determine the right crew for the right wall to increase productivity. With the system, superintendents are able to identify not only working patterns for each of the masons but also optimal crew formation, completion times, and labor costs. To validate the model, data from a real project in the United States is used to compare the crew allocation completed by the superintendent onsite with the one proposed by the system. The results showed that relating the characteristics of workers to site conditions had a substantial impact on reducing the completion time to build the walls, maximizing the utilization of masons, and outlining opportunities for concurrent work.
Article
Full-text available
Data classification in the presence of noise can lead to much worse results than expected for pure patterns. In this paper we investigate this problem in the case of deep convolutional neural networks in order to propose solutions that can mitigate the influence of noise. The main contributions presented in this paper are an experimental examination of the influence of different types of noise on the convolutional neural network, the proposition of a deep neural network operating as a denoiser, an investigation of deep network training with noise-contaminated patterns, and finally an analysis of noise addition during the training process of a deep network as a form of regularization. Our main findings are the construction of a deep-network-based denoising filter which outperforms state-of-the-art solutions, as well as a practical method of deep neural network training with noisy patterns for improved robustness against noisy test patterns. All results are underpinned by experiments which show the high efficacy and possibly broad applications of the proposed solutions.
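The last contribution, noise addition during training as regularization, reduces to a one-line change in the training loop: corrupt each batch with fresh Gaussian noise. A minimal PyTorch sketch, with a linear model and random data as placeholders:

```python
# Sketch: noise injection during training as a regularizer -- fresh
# Gaussian noise is added to each batch so the network never sees the
# same corrupted pattern twice. Model and data are toy placeholders.
import torch
from torch.utils.data import DataLoader, TensorDataset

def train_with_noise(model, loader, sigma=0.1, epochs=5, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            x_noisy = x + sigma * torch.randn_like(x)   # new corruption each step
            opt.zero_grad()
            loss = loss_fn(model(x_noisy), y)
            loss.backward()
            opt.step()
    return model

X = torch.randn(256, 20)
y = torch.randint(0, 3, (256,))
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)
model = train_with_noise(torch.nn.Linear(20, 3), loader)
```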
Article
Full-text available
Virtual design and construction (VDC) implementation remains a challenge as companies lack understanding of the implementation strategies and their relation to other important improvement efforts such as lean management. This article presents a performance modeling methodology that allows companies to assess VDC implementation strategies, including lean management as a moderator. The methodology is based on a conceptual model of the implementation variables that influence project performance and a mathematical method that uses partial least squares to explain the relationships among the multiple variables. The methodology was tested using data from an existing survey to identify the variables and quantify the relationships. A significant finding is that using lean as a moderator strengthens the connection between strategies and enables better company performance. The results are exploratory but provide interesting insights into VDC implementation strategies and provide evidence of the methodology's power.
Article
Full-text available
Probabilistic neural networks (PNNs) are artificial neural network algorithms widely used in pattern recognition and classification problems. In the traditional PNN algorithm, the probability density function (PDF) is approximated using the entire training dataset for each class. In some complex datasets, classmate clusters may be located far from each other, and these distances between clusters may cause a reduction in the correct class's posterior probability and lead to misclassification. This paper presents a novel PNN algorithm, the competitive probabilistic neural network (CPNN). In the CPNN, a competitive layer ranks kernels for each class and an optimum fraction of kernels is selected to estimate the class-conditional probability. Using a stratified, repeated, random subsampling cross-validation procedure and 9 benchmark classification datasets, CPNN is compared to both the traditional PNN and the state of the art (e.g. the enhanced probabilistic neural network, EPNN). These datasets are examined with and without noise and the algorithm is evaluated with several ratios of training to testing data. In all datasets (225 simulation categories), the performance percentages of both CPNN and EPNN are greater than or equivalent to that of the traditional PNN; in 73% of simulation categories, the CPNN analyses show modest improvement in performance over the state of the art.
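The competitive selection at the heart of CPNN can be sketched directly: for each class, rank the Gaussian kernels by their response to the query and let only the top fraction vote. Bandwidth and fraction below are arbitrary illustrative values.

```python
# Sketch: the competitive-layer idea -- for each class, only the top
# fraction of Gaussian kernels (ranked by response to the query)
# contributes to the class-conditional estimate. Toy data.
import numpy as np

def cpnn_scores(x, X_train, y_train, sigma=0.5, frac=0.3):
    scores = {}
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        k = np.exp(-np.sum((Xc - x) ** 2, axis=1) / (2 * sigma ** 2))
        top = np.sort(k)[::-1][: max(1, int(frac * len(k)))]   # competitive selection
        scores[c] = top.mean()        # estimate from the winning kernels only
    return scores

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y = np.repeat([0, 1], 50)
print(cpnn_scores(np.array([3.5, 3.5]), X, y))
```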
Article
Full-text available
Scheduling problems involving physical machines and human resources are frequent in real production environments. In this paper, we tackle a problem in which a set of tasks must be performed on a set of machines under the assistance of human operators, subject to some constraints such as precedence relations on the tasks, limited capacity of machines and operators, and skills of the operators to assist the processing of tasks. We analyze the problem and propose a generic schedule builder that may be adapted to build schedules in different dominant and non-dominant search spaces. The schedule builder was exploited as a decoder in a genetic algorithm. All the proposals were evaluated on a benchmark set with instances of different characteristics. The experimental study revealed useful insights of practical interest and showed substantial improvements of the genetic algorithm over existing methods in the literature.
Article
Full-text available
Physics-based models are intensively studied in mechanical and civil engineering, but their constant increase in complexity makes them harder to use in a maintenance context, especially when the degradation model can or should be updated from new inspection data. On the other hand, Markovian cumulative damage approaches such as gamma processes seem promising; however, they suffer from a lack of acceptability in the civil engineering community due to poor physics considerations. In this article, we promote an approach for modeling the degradation of structures and infrastructures for maintenance purposes which can be seen as intermediate between physical models and probabilistic models. A new statistical, data-driven, state-dependent model is proposed. The construction of the degradation model is discussed within an application to the cracking of concrete due to chloride-induced corrosion. Numerical experiments are then conducted to identify preliminary properties of the model in terms of statistical inference. An estimation algorithm is proposed to estimate the parameters of the model in cases where databases suffer from irregularities.
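For contrast with the physics-based side, the gamma-process baseline the article starts from is easy to simulate: independent gamma-distributed increments whose shape parameter scales with the time step. Parameter values below are illustrative.

```python
# Sketch: a stationary gamma degradation process -- independent gamma
# increments with shape proportional to the time step -- the kind of
# Markovian cumulative-damage model used as a baseline here.
import numpy as np

def gamma_process_path(alpha=1.2, beta=0.5, dt=0.1, n_steps=200, seed=0):
    rng = np.random.default_rng(seed)
    increments = rng.gamma(shape=alpha * dt, scale=beta, size=n_steps)
    return np.cumsum(increments)      # monotone degradation trajectory

path = gamma_process_path()
print(path[-1])                       # degradation level at the horizon
```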
Article
Full-text available
In this work, a novel self-organizing model called growing neural forest (GNF) is presented. It is based on the growing neural gas (GNG), which learns a general graph with no special provisions for datasets with separated clusters. On the contrary, the proposed GNF learns a set of trees so that each tree represents a connected cluster of data. High dimensional datasets often contain large empty regions among clusters, so this proposal is better suited to them than other self-organizing models because it represents these separated clusters as connected components made of neurons. Experimental results are reported which show the self-organization capabilities of the model. Moreover, its suitability for unsupervised clustering and foreground detection applications is demonstrated. In particular, the GNF is shown to correctly discover the connected component structure of some datasets. Moreover, it outperforms some well-known foreground detectors both in quantitative and qualitative terms.
Article
Full-text available
In highly automated human-machine systems, human operator functional state (OFS) prediction is an important approach to prevent accidents caused by operator fatigue, high mental workload, over-anxiety, etc. In this paper, psychophysiological indices, i.e. heart rate, heart rate variability, task load index and engagement index, recorded from operators who execute process control tasks are selected for OFS prediction. An adaptive differential evolution based neural network (ACADE-NN) is investigated. The behavior of ant colony foraging is introduced to self-adapt the control parameters of DE along with the mutation strategy at different evolution phases. The performance of ACADE is verified in the benchmark function tests. The designed ACADE-NN prediction model is used for estimation of the operator functional state. The empirical results illustrate that the proposed adaptive model is effective for most of the operators. The model outperforms the compared modeling methods and yields good generalization comparatively. It can describe the relationship between psychophysiological variables and OFS. It is applicable to assess the operator functional state in safety-critical applications.
Article
Full-text available
This paper presents an overview of significant advances made in the emerging field of nature-inspired computing (NIC) with a focus on the physics- and biology-based approaches and algorithms. A parallel development in the past two decades has been the emergence of the field of computational intelligence (CI) consisting primarily of the three fields of neural networks, evolutionary computing and fuzzy logic. It is observed that NIC and CI intersect. The authors advocate and foresee more cross-fertilisation of the two emerging fields.
Article
Full-text available
Dealing with short-term deformations and the tension-stiffening effect in reinforced concrete (RC), the current study consists of two parts as presented in two separate manuscripts. Based on the test data of more than 300 RC ties, alternative tension-stiffening relationships of different complexity were proposed in the first paper (Part I). In the companion manuscript (Part II), a stochastic modeling technique for assessing the deformation response of RC elements, subjected to different combinations of tension and flexure, is proposed. Based on stochastic principles, this technique makes it possible not only to predict the average deformation response, but also to establish bounds on these predictions, which are of vital importance for practical problems. The proposed technique is verified with the help of independent test data in order to validate the accuracy of the deformation predictions, using the tension-stiffening models proposed in Part I of the paper. Test specimens with different arrangements of steel or GFRP bars in the tensile zone were considered. The analysis revealed that the influence of the degree of sophistication of the tension-stiffening models on the analysis results is smaller than that of an adequate assessment of the shrinkage effect. The prediction accuracy is also related to the specific arrangement of reinforcement.
Article
Full-text available
In road maintenance, the user costs of different maintenance actions need to be assessed as well as the maintenance costs. The vehicle operating cost (VOC) and the travel delay cost are two major components of the user costs associated with road maintenance actions. This article simplifies the general calculation models of these two user cost components and develops a multiobjective Markov-based model to minimize both maintenance cost and user cost subject to a number of constraints, including the average annual budget limit and the performance requirement. The road deterioration process is modeled as a discrete-time Markov process, the states of road performance are defined in terms of the road roughness, and the state transition probabilities are estimated considering the effects of deterioration and maintenance actions. An example is provided to illustrate the use of the proposed road maintenance optimization model. The results show that the optimal road maintenance plan obtained from the model is practical to implement and is cost-effective compared with the periodical road maintenance plan. The results also indicate that the maintenance cost and the user cost are competing objectives. When maintenance works are carried out more frequently, the life-cycle maintenance costs increase while the life-cycle user costs decrease. This is because the VOC contributes the largest share of the user cost and its change has a contrary trend to the change of the maintenance cost over time.
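The deterioration core of the model is a discrete-time Markov chain over roughness states, with maintenance swapping in a more favorable transition matrix. A toy sketch with illustrative probabilities and a fixed maintain-every-fourth-year policy:

```python
# Sketch: roughness states evolving by a deterioration matrix, with a
# maintenance action replacing it by a restorative matrix. All
# transition probabilities are illustrative, not the paper's.
import numpy as np

P_deteriorate = np.array([[0.8, 0.2, 0.0],    # states: good, fair, poor
                          [0.0, 0.7, 0.3],
                          [0.0, 0.0, 1.0]])
P_maintain = np.array([[1.0, 0.0, 0.0],
                       [0.8, 0.2, 0.0],
                       [0.5, 0.4, 0.1]])

state = np.array([1.0, 0.0, 0.0])             # start in "good"
for year in range(10):
    P = P_maintain if year % 4 == 3 else P_deteriorate   # maintain every 4th year
    state = state @ P
print(state)                                  # distribution over roughness states
```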
Article
Full-text available
A multiparadigm general methodology is advanced for the development of reliable, efficient, and practical freeway incident detection algorithms. The performance of the new fuzzy-wavelet radial basis function neural network (RBFNN) freeway incident detection model of Adeli and Karim is evaluated and compared with the benchmark California algorithm #8 using both real and simulated data. The evaluation is based on three quantitative measures of detection rate, false alarm rate, and detection time, and the qualitative measure of algorithm portability. The new algorithm outperformed the California algorithm consistently under various scenarios. False alarms are a major hindrance to the widespread implementation of automatic freeway incident detection algorithms. The false alarm rate ranges from 0 to 0.07% for the new algorithm and from 0.53 to 3.82% for the California algorithm. The new fuzzy-wavelet RBFNN freeway incident detection model is a single-station pattern-based algorithm that is computationally efficient and requires no recalibration. The new model can be readily transferred without retraining and without any performance deterioration.
Article
Full-text available
A case-based reasoning (CBR) model is presented for freeway work zone traffic management. The model considers work zone layout, traffic demand, work characteristics, traffic control measures, and mobility impacts. A four-set case base schema or domain theory is developed to represent the cases based on the aforementioned characteristics of the problem. It includes a general information set, a problem description set, a solution (or control) description set, and an effects set. To improve the interactivity of the CBR system and its user-friendliness, a hierarchical object-oriented case model is developed for work zone traffic management. The model is implemented into an intelligent decision-support tool to assist traffic agencies in the development of work zone traffic control plans and to better design and manage work zones for increased mobility and safety. Three examples are presented to show the practical utility of the CBR system for work zone traffic management.
Article
A methodology is described for global and local health condition assessment of structural systems using the ambient vibration response of the structure collected by sensors. The model incorporates synchrosqueezed wavelet transform, Fast Fourier Transform, and an unsupervised deep Boltzmann machine to extract features from the frequency domain of the recorded signals. A probability density function is used to create a structural health index (SHI). This index can be used to assess both the global and local health conditions of the structure. A notable advantage of the proposed model is that it does not require costly experimental results obtained from a scaled version of the structure to simulate its different damage states. Only ambient vibrations of the healthy structure are needed. In the absence of ambient vibrations, they can be simulated stochastically using structural properties and probability theory. The effectiveness of the proposed model is illustrated using experimental data obtained on a shake table in Hong Kong.
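The health-index idea can be sketched with simple stand-ins: spectral features from healthy ambient records, a density fitted to them, and the log-density of a new record as the index. Here raw FFT magnitudes and a kernel density estimate replace the paper's DBM features and fitted PDF.

```python
# Sketch: a density-based health index -- low density under the healthy
# model suggests departure from the healthy state. FFT magnitudes and a
# KDE stand in for the paper's DBM feature extractor and PDF.
import numpy as np
from scipy.stats import gaussian_kde

def spectral_features(signals, n_bins=8):
    mags = np.abs(np.fft.rfft(signals, axis=1))
    return mags[:, :n_bins]           # low-frequency content only

rng = np.random.default_rng(0)
healthy = rng.normal(size=(100, 256))             # ambient vibration records
kde = gaussian_kde(spectral_features(healthy).T)  # density of healthy features

new_record = rng.normal(size=(1, 256))
shi = np.log(kde(spectral_features(new_record).T))
print(shi)                            # lower values = farther from healthy
```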
Article
Construction cost estimation, which is normally labor-intensive and error-prone, is one of the most important tasks of concern to the many participants in a project's life cycle. However, estimators' proficiency with the specifications for construction cost estimation greatly affects the efficiency and accuracy of cost estimation. By formalizing specifications for construction cost estimation, such specifications can be better implemented in computer programs so that the working efficiency and accuracy of estimation can be greatly improved. This study aims to establish an effective approach to formalize such specifications by using ontology. First, two typical specifications for construction cost estimation are analyzed and relevant models are established. Then, an ontology-based representation of the two specifications is established based on the models. Next, a prototype tool for facilitating the establishment, modification, and extension of the ontology-based representation for estimators is presented and the use cases of the tool are illustrated. Finally, the applicability of the approach is discussed. It is concluded that the formalized representation can be used to classify building components into items for bills of quantities and quota items automatically by computer programs to accelerate cost estimation.
Article
Finding construction components in cluttered point clouds is a critical pre-processing task that requires intensive and manual operations. Accurate isolation of an object from point clouds is key for further processing steps such as positive identification, scan-to-BIM, and robotic manipulation. Manual isolation is tedious, time-consuming, and disconnected from the automated tasks involved in the process. This paper adapts and examines a method for finding objects within 3D point clouds robustly, quickly, and automatically. A local feature on a pair of points is employed for representing 3D shapes. The method has three steps: (1) offline model library generation, (2) online searching and matching, and (3) match refinement and isolation. Experimental tests are carried out for finding industrial (curvilinear) and structural (rectilinear) elements. The method is verified under various circumstances in order to measure its performance toward addressing the major challenges involved in 3D object finding. Results show that the method is sufficiently quick and robust to be integrated with automated process control frameworks.
Article
Comprehensive and effective risk analysis is significant for studying the construction simulation of diversion tunnels. Existing tunnel risk simulation approaches mainly analyze ordinary risk factors and cannot quantitatively study risk events in light of their causes. Additionally, in other tunnel probabilistic risk analysis methods, although some studies have made fully probabilistic estimates of the tunnel schedule, risk factors cannot be studied with respect to cyclic construction characteristics, and the occurrence probability of risk events cannot be determined quantitatively. To address these issues, a probabilistic risk analysis approach for diversion tunnel construction simulation is proposed. Based on a hierarchical simulation model, risk factors can be analyzed at the operation level of tunnel construction. Moreover, a Bayesian network is embedded into the simulation program to quantitatively calculate the probability of risk events in each simulation cycle, considering geology, design, construction, and management conditions and their mutual dependence.
Article
A novel model is presented for global health monitoring of large structures such as high-rise buildings through the adroit integration of two signal processing techniques, the synchrosqueezed wavelet transform and the fast Fourier transform, an unsupervised machine learning technique, the restricted Boltzmann machine, and a recently developed supervised classification algorithm called the neural dynamics classification (NDC) algorithm. The model extracts hidden features in the frequency domain of the denoised measured response signals recorded by sensors on different elevations or floors of a structure. The extracted features are used as input to the NDC to detect and classify the global health of the structure into categories such as healthy, light damage, moderate damage, severe damage, and near collapse. The proposed model is validated using data obtained from a 3D 1:20-scale 38-story reinforced concrete building structure. The results are compared with three other supervised classification algorithms: k-nearest neighbor (KNN), probabilistic neural networks (PNN), and enhanced PNN (EPNN). NDC, EPNN, PNN, and KNN yield maximum average accuracies of 96%, 94%, 92%, and 82%, respectively.
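A scaled-down version of the pipeline is possible with off-the-shelf parts: sklearn's BernoulliRBM as the restricted Boltzmann machine feature extractor and logistic regression standing in for the NDC classifier. Data and damage labels below are random placeholders.

```python
# Sketch: an RBM-as-feature-extractor pipeline in the spirit of the
# paper; logistic regression replaces the NDC classifier, and the
# frequency-domain features are synthetic.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 64))  # denoised frequency-domain features
y = rng.integers(0, 5, size=200)       # classes: healthy ... near collapse

model = Pipeline([
    ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20)),
    ("clf", LogisticRegression(max_iter=1000)),
]).fit(X, y)
print(model.score(X, y))
```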
Article
An Earthquake Early Warning System (EEWS) can save lives. It can also be used to manage the critical lifeline infrastructure and essential facilities. Recent research on earthquake prediction towards the development of an EEWS can be classified into two groups based on a) arrival of P waves and b) seismicity indicators. The first approach can provide warnings within a timeframe of seconds. A seismicity indicator-based EEWS is intended to forecast major earthquakes within a time frame of weeks. In this paper, a novel seismicity indicator-based EEWS model, called neural EEWS (NEEWS), is presented for forecasting the earthquake magnitude and its location weeks before occurrence using a combination of a classification algorithm (CA) based on machine learning concepts and a mathematical optimization algorithm (OA). The role of the CA is to find whether there is an earthquake in a given time period greater than a predefined magnitude threshold and the role of the OA is to find the location of that earthquake with the maximum probability of occurrence. The model is tested using earthquake data in southern California with a combination of four CAs and one OA to find the best EEWS model. The proposed model is capable of predicting strong disastrous events as long as sufficient data are available for such events. The paper provides a novel solution to the complex problem of earthquake prediction through adroit integration of a machine learning classification algorithm and the robust neural dynamics optimization algorithm of Adeli and Park.
Article
Various challenging constraints must be satisfied in railway alignment design for topographically complex mountainous regions. The alignment design for such environments is so challenging that existing methodologies have great difficulties in automatically generating viable railway alignment alternatives. We solve this problem with a hybrid method in which a bidirectional distance transform (DT) algorithm automatically selects control points before a genetic algorithm (GA) refines the alignment. This approach solves the problems of (1) determining the appropriate distribution of control points in the GA and (2) producing alignments that deviate significantly from the DT-optimized paths. Automatic design of backtracking curves and dynamic generation of vertical points of intersection handling multiple constraints are developed to improve the GA performance. This method has been applied to a real case on the Sichuan-Tibet Railway where excessively severe natural gradients must be overcome. It automatically finds an alignment optimized for the given objectives and complex constraints, as well as various promising alternatives.
Article
We investigate two modified quantum evolutionary methods for solving real-value problems. Quantum Inspired Evolutionary Algorithms (QIEA) were originally used for solving binary-encoded problems, and their signature features follow from the superposition of multiple states on a quantum bit and a rotation gate. In order to apply this paradigm to real-value problems, we propose two quantum methods, Half Significant Bit (HSB) and Stepwise Real QEA (SRQEA), developed using binary and real encoding respectively, while keeping close to the original quantum computing metaphor. We evaluate our approaches against sets of multimodal mathematical test functions and real-world problems, using five performance metrics, and include comparisons to published results. We report the issues encountered while implementing some of the published real QIEA techniques. Our methods focus on introducing and implementing new rotation gate operators used for evolution, including a novel mechanism for preventing premature convergence in the binary algorithm. The applied performance metrics show superior results for our quantum methods on most of the test problems (especially the more complex and challenging ones), demonstrating faster convergence and accuracy.
Article
In the past decade, infrastructure-related legislation in the United States has consistently emphasized the need to measure the risk and uncertainty associated with infrastructure project cost estimates. Such cost variability is best viewed from the perspective of the project development phases of planning, design, bidding, and construction, and how the project cost estimate changes as it evolves across these phases. At the planning phase, a rough cost estimate is established. Additional scope and design information become available as the project proceeds through its subsequent phases, enabling progressive refinement of the project cost estimate. First, the literature addressing cost estimate deviations between the project development phases is dominated by comparisons of estimates across two specific phases: the final (as-built) project cost and the letting (pre-construction) cost. Second, past research has mainly focused on overruns while largely ignoring underruns. Third, the literature is replete with information about the cost deviation risks associated with the factors known at the design and construction phases, while very little has focused on the factors known at the early phases of project development. In an attempt to address these gaps, this paper introduces a methodology that uses risk-based multinomial discrete-outcome models and Monte Carlo simulation involving random draws. The methodology uses these tools to predict the probability that a project of known characteristics at the planning phase will follow a particular cost escalation pathway across the project development phases and that it will incur a given level of cost deviation severity. The paper then uses historical data on highway project costs, estimated at each phase of their development, to demonstrate how infrastructure agencies could apply the proposed methodology. The vector of predictors includes the infrastructure type, functional class, project size, administrative district, and time elapsed between each successive phase-pair of the project development process. Statistical models are then developed to estimate the probabilities that a highway project will follow a specific cost escalation pathway with a given direction and severity of cost deviation. Infrastructure management analysts can use the developed methodology to identify which projects are likely to experience a particular cost escalation pathway, and the direction and severity of the cost deviations. Overall, this methodology can help agencies measure the risk of cost deviation associated with each project and develop more realistic project contingency cost estimates.
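The simulation step is conceptually simple: draw a discrete outcome per phase-pair from the fitted multinomial model and tally the resulting pathways. The sketch below uses placeholder probabilities for three phase-pairs.

```python
# Sketch: Monte Carlo draws over a discrete-outcome model -- each draw
# sends a project down one cost-escalation outcome per phase-pair, and
# the simulation tallies pathway probabilities. Probabilities are
# placeholders for the fitted multinomial model.
import numpy as np

outcomes = ["underrun", "on-budget", "overrun"]
# rows: planning->design, design->letting, letting->as-built
p = np.array([[0.20, 0.50, 0.30],
              [0.15, 0.55, 0.30],
              [0.10, 0.50, 0.40]])

rng = np.random.default_rng(42)
n = 100_000
draws = np.stack([rng.choice(3, size=n, p=pi) for pi in p], axis=1)
pathways, counts = np.unique(draws, axis=0, return_counts=True)
for path, c in sorted(zip(pathways.tolist(), counts), key=lambda t: -t[1])[:5]:
    print([outcomes[i] for i in path], c / n)   # most likely pathways first
```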
Article
Imbalanced classification concerns problems that have an uneven distribution among classes. In addition, when instances are located in overlapping areas, correct modeling of the problem becomes harder. Current solutions for both issues are often focused on the binary case, as multi-class datasets require additional effort to be addressed. In this research, we overcome these problems by combining feature and instance selection. Feature selection simplifies the overlapping areas, easing the generation of rules to distinguish among the classes. Selection of instances from all classes addresses the imbalance itself by finding the most appropriate class distribution for the learning task, as well as possibly removing noise and difficult borderline examples. For the sake of obtaining an optimal joint set of features and instances, we embedded the search for both parameters in a Multi-Objective Evolutionary Algorithm, using the C4.5 decision tree as the baseline classifier in this wrapper approach. The multi-objective scheme offers a double advantage: the search space becomes broader, and we may provide a set of different solutions in order to build an ensemble of classifiers. This proposal has been contrasted with several state-of-the-art solutions on imbalanced classification, showing excellent results in both binary and multi-class problems.
Article
Radio frequency identification (RFID) helps improve supply chain efficiency by providing item-level identification and real-time information. Today, the barcode continues to be the main identification technology for precast construction applications. In this research, we investigate the data-driven mechanisms and benefits of utilizing RFID in knowledge-based precast construction supply chains. With computer-aided self-learning capability, we simulate three models for manual-, barcode-, and RFID-enabled precast construction supply chains. The results for the construction of 100 precast wall panels in a two-echelon precast construction supply chain reveal that the knowledge-based RFID system could generate a 62.0% saving in operational costs, which is 29.0% higher than that of a barcode-based system. As a result, the computer-aided adaptive learning mechanism based on RFID is verified to improve overall operational performance by reducing lead time, operational errors, and costs. Given the lack of existing literature on data technology utilization in the precast construction industry, our findings could improve decision making regarding technology selection, as well as help with the operationalization of RFID and the transformation to intelligent precast construction management in a big data environment.
Article
In this research, a novel family of learning rules called Beta Hebbian Learning (BHL) is thoroughly investigated to extract information from high-dimensional datasets by projecting the data onto low-dimensional (typically two-dimensional) subspaces, improving on existing exploratory methods by providing a clear representation of the data's internal structure. BHL applies a family of learning rules derived from the Probability Density Function (PDF) of the residual based on the beta distribution. This family of rules may be called Hebbian in that all of them use a simple multiplication of the output of the neural network with some function of the residuals after feedback. The derived learning rules can be linked to an adaptive form of Exploratory Projection Pursuit, and with artificial distributions the networks perform as the theory suggests they should: the use of different learning rules derived from different PDFs allows the identification of "interesting" dimensions (as far from the Gaussian distribution as possible) in high-dimensional datasets. This novel algorithm, BHL, has been tested on seven artificial datasets to study the behavior of the BHL parameters, and was later applied successfully to four real datasets, comparing its results, in terms of performance, with other well-known exploratory and projection models such as Maximum Likelihood Hebbian Learning (MLHL), Locally-Linear Embedding (LLE), Curvilinear Component Analysis (CCA), Isomap and Neural Principal Component Analysis (Neural PCA).
Article
An appropriate work-rest schedule is recognized as an effective way of providing a better ergonomic environment and improving labor productivity as well as safety. Construction workers usually undertake physically demanding tasks in an outdoor environment, with awkward postures and repetitive motions. This study proposes a mixed-integer linear programming approach to optimize the work-rest schedule for construction workers in hot weather, with the objective of maximizing the total productive time. The model takes into consideration the physical and physiological conditions of the workers, the working environment, the nature of the jobs, and the minimum rest duration required by government regulation. The results of numerical experiments show that the proposed model outperforms a default work-rest schedule by up to 10% in terms of total productive time. This implies considerable cost savings for the construction industry.
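A toy version of such a MILP fits in a few lines of PuLP: binary work/rest decisions per time slot, maximize productive slots, and require rest within every rolling window as a heat-stress proxy. All parameters are illustrative, not the paper's.

```python
# Sketch: a toy work-rest MILP -- maximize productive slots subject to at
# least one rest slot in every 4-slot rolling window. Parameters are
# illustrative placeholders.
import pulp

slots = range(16)                     # half-hour slots in a hot-weather shift
work = pulp.LpVariable.dicts("work", slots, cat="Binary")

prob = pulp.LpProblem("work_rest", pulp.LpMaximize)
prob += pulp.lpSum(work[t] for t in slots)           # total productive time
for t in range(len(slots) - 3):
    # at most 3 work slots in any 4-slot window (heat-stress proxy)
    prob += pulp.lpSum(work[t + k] for k in range(4)) <= 3

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([int(work[t].value()) for t in slots])
```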
Article
Training of large-scale neural networks, like those used nowadays in deep learning schemes, requires long computational times or the use of high-performance computation solutions such as those based on cluster computation, GPU boards, etc. As a possible alternative, in this work the back-propagation learning algorithm is implemented on an FPGA board using a layer multiplexing scheme, in which a single layer of neurons is physically implemented in parallel but can be reused any number of times in order to simulate multi-layer architectures. An on-chip implementation of the algorithm is carried out using a training/validation scheme in order to avoid overfitting effects. The hardware implementation is tested on several configurations, permitting the simulation of architectures comprising up to 127 hidden layers with a maximum of 60 neurons per layer. We confirmed the correct implementation of the algorithm and compared the computational times against C and Matlab code executed on a multicore supercomputer, observing a clear advantage for the proposed FPGA scheme. The layer multiplexing scheme provides a simple and flexible approach in comparison to standard implementations of the back-propagation algorithm, representing an important step towards the FPGA implementation of deep neural networks, one of the most novel and successful existing models for prediction problems.
Article
Construction laborer assignment is the assignment of the laborers in a team to the tasks of daily work in a construction project. This study proposes a mathematical programming approach to the optimal construction laborer assignment problem with the objectives of productivity and occupational health and safety while considering equity between laborers. Several linearization techniques are presented to transform the mathematical programming model into a mixed-integer linear program, which can be solved by off-the-shelf mixed-integer linear solvers. The proposed model is applied to extensive numerical experiments and the results show that the mathematical programming approach outperforms a conventional heuristic by over 10% in terms of job completion time. This implies considerable overhead, equipment, and manpower savings for the construction industry.
Article
Researchers often rely on simple methods to identify involvement of neurons in a particular motor task. The historical approach has been to inspect large groups of neurons and subjectively separate neurons into groups based on the expertise of the investigator. In cases where neuron populations are small it is reasonable to inspect these neuronal recordings and their firing rates carefully to avoid data omissions. In this paper, a new methodology is presented for automatic objective classification of neurons recorded in association with behavioral tasks into groups. By identifying characteristics of neurons in a particular group, the investigator can then identify functional classes of neurons based on their relationship to the task. The methodology is based on integration of a multiple signal classification (MUSIC) algorithm to extract relevant features from the firing rate and an expectation-maximization Gaussian mixture algorithm (EM-GMM) to cluster the extracted features. The methodology is capable of identifying and clustering similar firing rate profiles automatically based on specific signal features. An empirical wavelet transform (EWT) was used to validate the features found in the MUSIC pseudospectrum and the resulting signal features captured by the methodology. Additionally, this methodology was used to inspect behavioral elements of neurons to physiologically validate the model. This methodology was tested using a set of data collected from awake behaving non-human primates.
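The clustering stage can be sketched with generic parts: features extracted from firing-rate profiles (plain PCA components below, standing in for MUSIC pseudospectrum features) grouped by an EM-fitted Gaussian mixture.

```python
# Sketch: EM-GMM clustering of neuron firing-rate features. PCA
# components replace the paper's MUSIC pseudospectrum features, and the
# firing-rate matrix is a random placeholder.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
rates = rng.normal(size=(80, 200))    # 80 neurons x 200 time bins
features = PCA(n_components=5).fit_transform(rates)
labels = GaussianMixture(n_components=3, random_state=0).fit_predict(features)
print(np.bincount(labels))            # neurons per functional group
```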
Article
The efficient use of resources is a key factor to minimize the cost while meeting time deadlines and quality requirements; this is especially important in construction projects where field operations make fluctuations of resources unproductive and costly. Resource Leveling Problems (RLP) aim to sequence the construction activities so as to maximize the resource consumption efficiency over time, minimizing the variability. Exact algorithms for the RLP have been proposed throughout the years to offer optimal solutions; however, these problems require a vast computational capability ("combinatorial explosion") that makes them impractical. Therefore, alternative heuristic and metaheuristic algorithms have been suggested in the literature to find local optimal solutions, using different libraries to benchmark optimal values; for example, the Project Scheduling Problem LIBrary (PSPLIB) for minimal lags is still open to be solved to optimality for the RLP. To partially fill this gap, the authors propose a Parallel Branch and Bound algorithm that solves the RLP with minimal lags with an acceptable computational effort. This way, this research contributes to the body of knowledge of construction project scheduling by providing the optima of 50 problems for the RLP with minimal lags for the first time, allowing future contributors to benchmark their heuristic methods against exact results by obtaining the distance of their solution to the optimal values. Furthermore, for practitioners, the time required to solve this kind of problem is reasonable and practical, considering that unbalanced resources can risk the goals of the construction project.
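One leveling objective commonly used for the RLP (among several variants) is the sum of squared daily resource usage, which penalizes peaks; a small sketch with hypothetical activities:

```python
def leveling_cost(activities, horizon):
    """Sum of squared daily resource usage for a given schedule.

    activities: list of (start_day, duration, resource_rate) tuples.
    A lower cost means a flatter resource profile over time.
    """
    usage = [0] * horizon
    for start, dur, rate in activities:
        for day in range(start, start + dur):
            usage[day] += rate
    return sum(u * u for u in usage)

# hypothetical 3-activity schedule over a 10-day horizon
print(leveling_cost([(0, 4, 2), (2, 3, 3), (5, 5, 1)], 10))
```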
Article
Real-world problems often involve the optimisation of multiple conflicting objectives. These problems, referred to as multi-objective optimisation problems, are especially challenging when more than three objectives are considered simultaneously. This paper proposes an algorithm to address this class of problems. The proposed algorithm is an evolutionary algorithm based on an evolution strategy framework, and more specifically, on the Covariance Matrix Adaptation Pareto Archived Evolution Strategy (CMA-PAES). A novel selection mechanism is introduced and integrated within the framework. This selection mechanism makes use of an adaptive grid to perform a local approximation of the hypervolume indicator which is then used as a selection criterion. The proposed implementation, named Covariance Matrix Adaptation Pareto Archived Evolution Strategy with Hypervolume-sorted Adaptive Grid Algorithm (CMA-PAES-HAGA), overcomes the limitation of CMA-PAES in handling more than two objectives and displays a remarkably good performance on a scalable test suite in five-, seven-, and ten-objective problems. The performance of CMA-PAES-HAGA has been compared with that of a competition-winning meta-heuristic, representing the state-of-the-art in this sub-field of multi-objective optimisation. The proposed algorithm has been tested in a seven-objective real-world application, i.e., the design of an aircraft lateral control system. In this optimisation problem, CMA-PAES-HAGA greatly outperformed its competitors.
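For two minimization objectives, the hypervolume indicator that the selection mechanism approximates can be computed exactly by a sweep over the sorted front; the front and reference point below are made up:

```python
def hypervolume_2d(front, ref):
    """Exact hypervolume of a 2-objective front (minimization).

    front: list of (f1, f2) points; ref: reference point worse than
    every front point in both objectives.
    """
    pts = sorted(front)                # ascending in f1
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:               # skip dominated points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

print(hypervolume_2d([(1, 4), (2, 2), (3, 1)], ref=(5, 5)))  # -> 12.0
```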
Article
Mathematical morphology (MM) is a popular formalism largely used for image processing. Of particular relevance in MM-based operations is the structuring element (SE). In an image processing environment, the SE defines which pixel values in the input image to include in the calculation of the output value. Most MM-based image processing environments employ limited-size SEs, which prevents their use in tasks requiring larger SEs. This paper proposes a computer-based method for optimizing the decomposition of SEs in binary-image-related tasks that employs binary MM and automatically transforms an original SE into a corresponding sequence of 3 × 3 SEs. The decomposition operation reduces the complexity of the morphological operations and has been implemented as a genetic algorithm (GA) based process that searches for the best sequence of smaller structuring elements, using one dilation and four union operations, for the decomposition of each large-sized structuring element. By using a GA with a fixed-length chromosome as well as a fixed number of dilation and union operations, the method has a simple and fixed structure, which makes it a convenient choice for hardware implementations. Its performance, based on six images already used in the literature by another well-established method, has been shown to be competitive.
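The property being exploited is that dilation by a large SE equals chained dilations by its factors; a quick SciPy check for the simple case where a 5 × 5 square decomposes into two 3 × 3 squares:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
img = rng.random((32, 32)) > 0.9       # hypothetical sparse binary image

big_se = np.ones((5, 5), dtype=bool)   # large structuring element
small_se = np.ones((3, 3), dtype=bool) # its 3x3 factor

direct = ndimage.binary_dilation(img, structure=big_se)
chained = ndimage.binary_dilation(
    ndimage.binary_dilation(img, structure=small_se), structure=small_se)

print(np.array_equal(direct, chained))  # True: the decomposition is exact
```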
Article
The performance and serviceability of structural systems during their lifetime can be significantly affected by the occurrence of extreme events. Despite their low probability, there is a potential for multiple occurrences of such hazards during the relatively long service life of systems. This paper introduces a comprehensive framework for the assessment of the lifecycle cost of infrastructure subject to multiple hazard events throughout its decision-making time horizon. The framework entails the lifecycle costs of maintenance and repair, as well as the salvage value of the structure at the end of the decision-making time horizon. The primary features of the proposed framework include accounting for the possibility of multiple hazard occurrences, incorporating effects of incomplete repair actions on the accumulated damage through damage-state-dependent repair times, and requiring limited resources in terms of input data and computational costs. A dynamic programming procedure is proposed to calculate the expected damage condition of the structure for each possible number of hazard incidents based on state-dependent fragility curves. The proposed framework is applied to a moment-frame building located in a region with high seismicity, and lifecycle costs are evaluated for six retrofit plans. The results displayed variation in the ranking of the retrofit actions with respect to the decision-making time horizon. Furthermore, the sensitivity analyses demonstrated that disregarding repair time in the lifecycle cost analysis can result in false identification of unsafe retrofit actions as optimal and reliable strategies.
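A heavily simplified sketch of the expected-cost recursion: a Markov chain over hypothetical damage states is propagated year by year, with all probabilities, costs, and the discount factor invented for illustration (the actual framework uses state-dependent fragility curves and repair times):

```python
import numpy as np

# hypothetical inputs: 3 damage states, annual hazard probability,
# state-dependent worsening given a hazard, and per-state annual cost
P_HAZARD = 0.05
T = np.array([[0.6, 0.3, 0.1],        # from intact
              [0.0, 0.5, 0.5],        # from moderate damage
              [0.0, 0.0, 1.0]])       # from severe damage (absorbing)
COST = np.array([0.0, 10.0, 40.0])    # expected annual cost per state
DISCOUNT = 0.97

def expected_lifecycle_cost(horizon):
    """Propagate the damage-state distribution year by year and
    accumulate the expected discounted cost."""
    dist = np.array([1.0, 0.0, 0.0])  # start intact
    total = 0.0
    for year in range(horizon):
        total += (DISCOUNT ** year) * (dist @ COST)
        dist = P_HAZARD * (dist @ T) + (1 - P_HAZARD) * dist
    return total

print(round(expected_lifecycle_cost(50), 2))
```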
Article
A novel technique of quantitative EEG for differentiating patients with early-stage Creutzfeldt-Jakob disease (CJD) from other forms of rapidly progressive dementia (RPD) is proposed. The discrimination is based on the extraction of suitable features from the time-frequency representation of the EEG signals through continuous wavelet transform (CWT). An average measure of complexity of the EEG signal obtained by permutation entropy (PE) is also included. The dimensionality of the feature space is reduced through a multilayer processing system based on the recently emerged deep learning (DL) concept. The DL processor includes a stacked auto-encoder, trained by unsupervised learning techniques, and a classifier whose parameters are determined in a supervised way by associating the known category labels to the reduced vector of high-level features generated by the previous processing blocks. The supervised learning step is carried out by using either support vector machines (SVM) or multilayer neural networks (MLP-NN). A subset of EEG from patients suffering from Alzheimer's Disease (AD) and healthy controls (HC) is considered for differentiating CJD patients. When fine-tuning the parameters of the global processing system by a supervised learning procedure, the proposed system is able to achieve an average accuracy of 89%, an average sensitivity of 92%, and an average specificity of 89% in differentiating CJD from RPD. Similar results are obtained for CJD versus AD and CJD versus HC.
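Permutation entropy, used here as the complexity measure, is straightforward to compute; a minimal Bandt-Pompe implementation:

```python
import math
from itertools import permutations

def permutation_entropy(signal, order=3):
    """Normalized permutation entropy of a 1-D signal (Bandt-Pompe)."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(signal) - order + 1):
        window = signal[i : i + order]
        # ordinal pattern: ranks of the samples within the window
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))   # normalize to [0, 1]

print(permutation_entropy([4, 7, 9, 10, 6, 11, 3], order=3))
```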
Article
Effective common spatial pattern (CSP) feature extraction for motor imagery (MI) electroencephalogram (EEG) recordings usually depends to a large extent on the selection of the filter band. Subband optimization has been suggested to enhance classification accuracy of MI. Accordingly, this study introduces a new method that implements sparse Bayesian learning of frequency bands (named SBLFB) from EEG for MI classification. CSP features are extracted from a set of signals that are generated by a filter bank with multiple overlapping subbands from raw EEG data. Sparse Bayesian learning is then exploited to implement selection of significant features with a linear discriminant criterion for classification. The effectiveness of SBLFB is demonstrated on the BCI Competition IV IIb dataset, in comparison with several other competing methods. Experimental results indicate that the SBLFB method is promising for development of an effective classifier to improve MI classification.
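CSP filters for one subband reduce to a generalized eigenvalue problem on the two class-average covariance matrices; a sketch with random stand-in trials:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_filters=2):
    """CSP spatial filters via the generalized eigenvalue problem.

    trials_*: arrays of shape (n_trials, n_channels, n_samples),
    assumed already band-pass filtered into one subband.
    """
    def mean_cov(trials):
        covs = [t @ t.T / np.trace(t @ t.T) for t in trials]
        return np.mean(covs, axis=0)

    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    # solve Ca w = lambda (Ca + Cb) w; extreme eigenvalues discriminate best
    vals, vecs = eigh(Ca, Ca + Cb)
    idx = np.concatenate([np.arange(n_filters),
                          np.arange(len(vals) - n_filters, len(vals))])
    return vecs[:, idx].T

rng = np.random.default_rng(3)
A = rng.normal(size=(20, 8, 256))     # hypothetical class-A trials
B = rng.normal(size=(20, 8, 256))     # hypothetical class-B trials
W = csp_filters(A, B)
feats = np.log(np.var(W @ A[0], axis=1))   # log-variance CSP features
print(feats)
```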
Article
Computer aided diagnosis (CAD) constitutes an important tool for the early diagnosis of Alzheimer’s Disease (AD), which, in turn, allows the application of treatments that can be simpler and more likely to be effective. This paper explores the construction of classification methods based on deep learning architectures applied to brain regions defined by the Automated Anatomical Labelling (AAL). Gray Matter (GM) images from each brain area have been split into 3D patches according to the regions defined by the AAL atlas, and these patches are used to train different deep belief networks. An ensemble of deep belief networks is then composed, where the final prediction is determined by a voting scheme. Two deep learning based structures and four different voting schemes are implemented and compared, giving as a result a potent classification architecture where discriminative features are computed in an unsupervised fashion. The resulting method has been evaluated using a large dataset from the Alzheimer's Disease Neuroimaging Initiative (ADNI). Classification results assessed by cross-validation prove that the proposed method is not only valid for differentiating between controls (NC) and AD images, but also provides good performance when tested on the more challenging case of classifying Mild Cognitive Impairment (MCI) subjects. In particular, the classification architecture provides accuracy values up to 0.90 and AUC of 0.95 for NC/AD classification, accuracy of 0.84 and AUC of 0.91 for stable MCI/AD classification, and accuracy of 0.83 and AUC of 0.95 for NC/MCI converters classification.
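One of the simplest voting schemes, hard majority voting over the region-wise networks, can be sketched as follows (the per-network labels are invented):

```python
import numpy as np

def majority_vote(member_predictions):
    """Hard-voting ensemble: each row holds one network's class labels
    for all subjects; ties resolve to the lower class label."""
    member_predictions = np.asarray(member_predictions)
    n_classes = member_predictions.max() + 1
    votes = np.apply_along_axis(
        lambda col: np.bincount(col, minlength=n_classes), 0,
        member_predictions)               # shape: (n_classes, n_subjects)
    return votes.argmax(axis=0)

# hypothetical labels from 5 region-wise networks for 6 subjects (0=NC, 1=AD)
preds = [[0, 1, 1, 0, 1, 0],
         [0, 1, 0, 0, 1, 0],
         [1, 1, 1, 0, 0, 0],
         [0, 0, 1, 0, 1, 1],
         [0, 1, 1, 1, 1, 0]]
print(majority_vote(preds))               # -> [0 1 1 0 1 0]
```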
Article
This paper investigates the potential of using a polynomial radial basis function (RBF) neural network to extract the shoreline position from coastal video images. The basic structure of the proposed network encompasses a standard RBF network module, a module of nodes that use Chebyshev polynomials as activation functions, and an inference module. The experimental setup is an operational coastal video monitoring system deployed at two sites in Southern Europe to generate variance coastal images. The histogram of each image is approximated by non-linear regression and associated with a manually extracted intensity threshold value that quantifies the shoreline position. The key idea is to use the set of the resulting regression parameters as input data, and the intensity threshold values as output data of the network. In summary, the data set is extracted by quantifying the qualitative image information, and the proposed network takes advantage of the powerful approximation capabilities of the Chebyshev polynomials by utilizing a small number of coefficients. For comparative reasons, we apply a polynomial RBF network trained by fuzzy clustering, and a feed-forward neural network trained by the back propagation algorithm. The comparison criteria used are the standard mean square error, the data return rate, and the root mean square error of the cross-shore shoreline position, calculated against the shorelines extracted by the aforementioned annotated threshold values. The main conclusions of the simulation study are: (a) the proposed method outperforms the other networks, especially in extracting the shoreline from images used as testing data; (b) for higher polynomial orders it obtains data return rates greater than 84%, and the root mean square error of the cross-shore shoreline position is less than 1.8 meters.
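The two-module structure, RBF nodes alongside Chebyshev polynomial nodes feeding a linear output, can be imitated with a hybrid design matrix and least squares; the data below are a synthetic stand-in for the histogram-regression parameters:

```python
import numpy as np
from numpy.polynomial import chebyshev

def hybrid_design_matrix(x, centers, width, cheb_order):
    """Column-stack Gaussian RBF features with Chebyshev polynomial
    features, mirroring the two-module structure described above."""
    rbf = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
    cheb = chebyshev.chebvander(x, cheb_order)  # T_0 .. T_order at x
    return np.hstack([rbf, cheb])

rng = np.random.default_rng(4)
x = np.linspace(-1, 1, 200)
y = np.sin(3 * x) + 0.05 * rng.normal(size=x.size)  # synthetic target

Phi = hybrid_design_matrix(x, centers=np.linspace(-1, 1, 7),
                           width=0.3, cheb_order=4)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)         # linear output layer
print(np.sqrt(np.mean((Phi @ w - y) ** 2)))         # training RMSE
```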
Article
This paper presents a review of the recent literature on sustainability in construction and design with a focus on highrise buildings. The paper is divided into the following main sections: energy consumption, environmental effects, and green practices for highrise buildings. A number of concepts in sustainable design are reviewed, including passive solar design, renewable energy resources, cogeneration and tri-generation, embodied energy reduction, net zero energy building, carbon emission reduction, envelope environment quality, green materials, efficient mechanical design, and innovative structural systems. Their applications in a dozen signature and iconic structures are described. In order to achieve net zero energy in a new highrise building, first, multiple green solutions need to be evaluated using two categories of solutions: passive solar and envelope environment design, and renewable energy resources along with efficient energy generators. Next, a robust optimization algorithm should be used to select the optimum set of solutions. This is worth pursuing in future sustainable design of highrise buildings because they are massive and complex structures with many components.
Article
This paper proposes an evolutionary computation based approach for solving resource leveling optimization problems in project management. In modern management engineering, problems of this kind refer to the optimal handling of available resources in a candidate project and have emerged as the result of the ever-increasing needs of project managers in facing project complexity, controlling related budgeting and finances, and managing the construction production line. Standard approaches, such as exhaustive or greedy search methodologies, fail to provide near-optimum solutions in feasible time even for small-scale problems, whereas intelligent approaches manage to quickly reach high-quality near-optimal solutions. In this work, a new genetic algorithm is proposed which investigates the start times of the non-critical activities of a project in order to optimally allocate its resources. The innovation of the proposed approach lies in certain genetic operations, such as crossover, applied to improve the solution quality from generation to generation. The presentation and performance comparison of all multi-objective functions for resource leveling available in the literature is another interesting part of this work. Detailed experiments with small and medium size benchmark problems taken from publicly available project data resources produce highly accurate resource profiles. As shown in the experimental results, the proposed methodology proves capable of coping even with large size project management problems without the need to divide the original problem into sub-problems due to complexity.
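A minimal sketch of genetic operators on start-time chromosomes, assuming each non-critical activity may shift within its slack window (the paper's actual operators are more elaborate):

```python
import random

def crossover(parent_a, parent_b):
    """Uniform crossover on vectors of non-critical activity start shifts."""
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]

def mutate(child, slack, rate=0.1):
    """Re-draw a start shift within the activity's allowed slack window."""
    return [random.randint(0, slack[i]) if random.random() < rate else s
            for i, s in enumerate(child)]

random.seed(5)
slack = [3, 5, 2, 4]                  # hypothetical max shift per activity
pop = [[random.randint(0, s) for s in slack] for _ in range(6)]
child = mutate(crossover(pop[0], pop[1]), slack)
print(child)  # fitness would then be a leveling cost, e.g. squared usage
```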
Article
This paper focuses on the automatic fuzzy clustering problem and proposes a novel automatic fuzzy clustering method that employs, as its computing framework, an extended membrane system with active membranes. The extended membrane system has a dynamic membrane structure; since membranes can evolve, it is particularly suitable for processing the automatic fuzzy clustering problem. A modification of a differential evolution (DE) mechanism was developed as the evolution rule for objects, according to the membrane structure and object communication mechanisms. Under the control of both the object's evolution-communication mechanism and the membrane evolution mechanism, the extended membrane system can effectively determine the most appropriate number of clusters as well as the corresponding optimal cluster centers. The proposed method was evaluated over 13 benchmark problems and was compared with four state-of-the-art automatic clustering methods, two recently developed clustering methods, and six classification techniques. The comparison results demonstrate the superiority of the proposed method in terms of effectiveness and robustness.
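The DE mechanism used as the object evolution rule follows the classic DE/rand/1/bin step; a sketch on hypothetical candidate cluster centers:

```python
import numpy as np

def de_step(population, f=0.5, cr=0.9, rng=None):
    """One DE/rand/1/bin step: the kind of evolution rule applied to
    objects (here, candidate cluster-center vectors)."""
    if rng is None:
        rng = np.random.default_rng(6)
    n, dim = population.shape
    trials = population.copy()
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3,
                                replace=False)
        mutant = population[r1] + f * (population[r2] - population[r3])
        mask = rng.random(dim) < cr
        mask[rng.integers(dim)] = True   # guarantee at least one mutant gene
        trials[i] = np.where(mask, mutant, population[i])
    return trials

pop = np.random.default_rng(7).random((8, 2))  # 8 candidate 2-D centers
print(de_step(pop))
```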
Article
Predicting the price of housing is of paramount importance for near-term economic forecasting of any nation. This paper presents a novel and comprehensive model for estimating the price of new housing in any given city at the design phase or the beginning of construction through ingenious integration of a deep belief restricted Boltzmann machine and a unique nonmating genetic algorithm. The model can be used by construction companies to gauge the sale market before they start a new construction and decide whether or not to build. An effective data structure is presented that takes into account a large number of economic variables/indices. The model incorporates time-dependent and seasonal variations of the variables. Clever stratagems have been developed to overcome the curse of dimensionality and make the solution of the problem tractable on standard workstations. A case study is presented to demonstrate the effectiveness and accuracy of the model.
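The restricted Boltzmann machine at the core of such a deep belief model is typically trained with contrastive divergence; a minimal CD-1 sketch on made-up binarized inputs:

```python
import numpy as np

rng = np.random.default_rng(8)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def cd1_update(W, bv, bh, v0, lr=0.05):
    """One contrastive-divergence (CD-1) update of a binary RBM."""
    ph0 = sigmoid(v0 @ W + bh)                 # positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + bv)               # reconstruction
    ph1 = sigmoid(pv1 @ W + bh)                # negative phase
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / v0.shape[0]
    bv += lr * (v0 - pv1).mean(axis=0)
    bh += lr * (ph0 - ph1).mean(axis=0)
    return W, bv, bh

# hypothetical binarized economic-variable vectors: 32 samples, 12 inputs
V = (rng.random((32, 12)) > 0.5).astype(float)
W, bv, bh = 0.01 * rng.normal(size=(12, 6)), np.zeros(12), np.zeros(6)
for _ in range(100):
    W, bv, bh = cd1_update(W, bv, bh, V)
print(np.abs(W).mean())                        # weights have adapted
```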
Book
In nonprofit managerial accounting, cost containment is important if the organization is to remain viable. Cost performance needs to be evaluated to determine how efficiently services are being provided. A nonprofit manager needs to provide services to patrons, and without a solid understanding of how costs are incurred and measured, that cannot be done. The book provides guidance in this area.
Article
Highway reconstruction consumes large amounts of energy and material resources and at the same time produces significant quantities of emissions during material processing, transportation, and site construction. Sustainable highway reconstruction requires strategies that optimally utilize recycled or virgin material from supply locations (e.g., existing roadway, material markets), assign material use in reconstruction process, select fixed staging area(s) and mobile unit location(s) to process material, and ship the material to the destinations (e.g., markets, landfills, highway construction sites). We present a decision support system based on a network optimization model that determines (i) optimal locations of fixed staging areas and mobile processing units and (ii) optimal material recycling and shipment strategies that minimize the total cost for material procurement, transportation, staging area and mobile processing unit investment, and CO2 equivalent (CO2e) emissions. A hypothetical case study of a 35‐mile reconstruction project was conducted to test the performance of the model and to draw insights on how different system parameters influence the optimal staging locations, recycled material use, and traffic management plan. It was found that the use of mobile processors and/or recycled material can have significant impacts on hauling costs, emissions, and optimal locations of staging areas.
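At its core, the shipment part of such a model is a transportation problem; a toy balanced instance with invented costs, supplies, and demands, solved with SciPy:

```python
import numpy as np
from scipy.optimize import linprog

# hypothetical per-ton cost (hauling + emissions) from 2 supply points
# (existing roadway, material market) to 2 destinations (site, landfill);
# decision vector x = [s1->d1, s1->d2, s2->d1, s2->d2]
cost = np.array([[4.0, 6.0],
                 [7.0, 3.0]]).ravel()
supply, demand = [50, 70], [60, 60]    # balanced: 120 tons each way

A_eq = [[1, 1, 0, 0],                  # ship all of supply 1
        [0, 0, 1, 1],                  # ship all of supply 2
        [1, 0, 1, 0],                  # meet demand 1
        [0, 1, 0, 1]]                  # meet demand 2
res = linprog(cost, A_eq=A_eq, b_eq=supply + demand,
              bounds=[(0, None)] * 4, method="highs")
print(res.x.reshape(2, 2), res.fun)    # optimal shipments and total cost
```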
Article
Semiconductor hookup construction (i.e., constructing process tool piping systems) is critical to semiconductor fabrication plant completion. During the conceptual project phase, it is difficult to conduct an accurate cost estimate due to the great amount of uncertain cost items. This study proposes a new model for estimating semiconductor hookup construction project costs. The developed model, called FALCON‐COST, integrates the component ratios method, fuzzy adaptive learning control network (FALCON), fast messy genetic algorithm (fmGA), and three‐point cost estimation method to systematically deal with a cost‐estimating environment involving limited and uncertain data. In addition, the proposed model improves the current FALCON by devising a new algorithm to conduct building block selection and random gene deletion so that fmGA operations can be implemented in FALCON. The results of 54 case studies demonstrate that the proposed model has estimation accuracy of 83.82%, meaning it is approximately 22.74%, 23.08%, and 21.95% more accurate than the conventional average cost method, component ratios method, and modified FALCON‐COST method, respectively. Providing project managers with reliable cost estimates is essential for effectively controlling project costs.
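The three-point component is easy to illustrate: under the common beta-PERT form (the paper may weight the points differently), the expected cost and spread of one item are:

```python
def three_point_estimate(optimistic, most_likely, pessimistic):
    """Beta-PERT expected cost and standard deviation for one cost item,
    a common form of the three-point estimation method mentioned above."""
    mean = (optimistic + 4 * most_likely + pessimistic) / 6
    std = (pessimistic - optimistic) / 6
    return mean, std

# hypothetical hookup piping cost item (in $1000s)
print(three_point_estimate(80, 100, 150))   # -> (105.0, approx. 11.67)
```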
Article
Following an extensive literature review it was established that professional subjective judgement and regression analysis were the two main techniques utilized for predicting the seismic retrofit construction cost. The study presented here aims at predicting this cost by employing a more advanced modelling technique, known as the artificial neural network (ANN) methodology. Using this methodology, a series of non-parametric ANN models was developed based on significant predictors of the retrofit net construction cost (RNCC). Data on these predictors, together with the RNCC, were collected from 158 earthquake-prone public school buildings, each having a framed structure. A novel systematic framework was proposed with the aim to increase the generalization ability of ANN models. Using this framework, the values of critical components involved in the design of ANN models were defined. These components included the number of hidden layers and neurons, and learning parameters in terms of learning rate and momentum. The sensitivity of the developed ANN models to these components was examined and it was found that the predictive performance of these models was more influenced by the number of hidden neurons than by the value of learning parameters. Also, the results of this examination revealed that the overlearning problem became more serious with an increase in the number of predictors. In addition to the framework proposed for the successful development of ANN models, the primary contribution of this study to the construction industry is the introduction of “building total area” as the key predictor of the RNCC. This predictor enables a reliable estimation of the RNCC to be made at the early development stage of a seismic retrofit project when little information is known about the project.
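The reported sensitivity to the number of hidden neurons can be probed with a simple loop over network widths; everything below (data, sizes, settings) is synthetic and only mirrors the style of such an experiment:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)
X = rng.random((158, 5))                     # stand-in for 158 buildings
y = 200 * X[:, 0] + 50 * X[:, 1] + 10 * rng.normal(size=158)  # fake RNCC

for n_hidden in (2, 8, 32, 128):             # sensitivity to hidden neurons
    mlp = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=5000,
                       learning_rate_init=0.01, random_state=0)
    score = cross_val_score(mlp, X, y, cv=5, scoring="r2").mean()
    print(n_hidden, round(score, 3))
```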
Article
Estimation of the cost of a construction project is an important task in the management of construction projects. The quality of construction management depends on accurate estimation of the construction cost. Highway construction costs are very noisy and the noise is the result of many unpredictable factors. In this paper, a regularization neural network is formulated and a neural network architecture is presented for estimation of the cost of construction projects. The model is applied to estimate the cost of reinforced-concrete pavements as an example. The new computational model is based on a solid mathematical foundation making the cost estimation consistently more reliable and predictable. Further, the result of estimation from the regularization neural network depends only on the training examples. It does not depend on the architecture of the neural network, the learning parameters, and the number of iterations required for training the system. Moreover, the problem of noise in the data is taken into account in a rational manner.
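A regularization network with a Gaussian kernel has a closed-form solution that depends only on the training examples, consistent with the property noted above; a sketch on made-up cost data:

```python
import numpy as np

def fit_regularization_network(X, y, lam=0.1, gamma=1.0):
    """Closed-form regularization network (Gaussian kernel): the fitted
    predictor is determined entirely by the training examples."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Xq: np.exp(
        -gamma * ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)) @ alpha

rng = np.random.default_rng(9)
X = rng.random((40, 3))                           # hypothetical project features
y = 100 + 50 * X[:, 0] + 10 * rng.normal(size=40) # noisy cost data
predict = fit_regularization_network(X, y)
print(predict(X[:3]), y[:3])                      # fitted vs. observed costs
```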