Conference Paper

Analysis of pressure measurements in buildings

Article
This case study investigates the effects of ventilation intervention on measured and perceived indoor air quality (IAQ) in a repaired school where occupants reported IAQ problems. Occupants’ symptoms were suspected to be related to impurities leaking indoors through the building envelope. The study’s aim was to determine whether a positive pressure of 5–7 Pa prevents the infiltration of harmful chemical and microbiological agents from structures, thus decreasing symptoms and discomfort. The ventilation intervention was conducted in a building section comprising 12 classrooms and was complemented by IAQ measurements and occupant questionnaires. After the intervention, the concentrations of total volatile organic compounds (TVOC) and fine particulate matter (PM2.5) decreased, and occupants’ negative perceptions became more moderate compared to those for other parts of the building. The indoor mycobiota differed in species composition from the outdoor mycobiota, and changed remarkably with the intervention, indicating that some species may have emanated from an indoor source before the intervention.
Conference Paper
For the design of naturally ventilated buildings, information on air speed at the openings of a building is important. However, the only data set usually available to designers is meteorological data, such as wind speed and direction measured at weather stations. This paper explores the ratio of air speed at building openings to the wind speed measured at weather stations. Meteorological data from three weather stations, as well as air velocities obtained through full-scale physical measurements, were used in this study. The results showed that air speed at the building openings was about half of the wind speed recorded at the station closest to the case study. This ratio dropped to approximately 30% when comparing to weather stations located at a greater distance and in more open areas. Given that air speed at the openings is directly related to the ventilation rate, applying these ratios to the available weather data when designing for natural ventilation can provide a more realistic picture of natural ventilation performance.
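As a back-of-the-envelope illustration of how the reported ratios might be applied in early design, the following sketch scales a station wind-speed reading by the two ratios from the abstract above. The function name and the 4 m/s reading are assumptions for this sketch, not values from the paper.

```python
# Hypothetical helper applying the ratios reported above: opening air speed
# is roughly 50% of the wind speed at the nearest weather station, and
# roughly 30% for stations at a greater distance in more open areas.

def opening_air_speed(station_wind_speed_ms, nearest_station=True):
    """Estimate air speed (m/s) at a building opening from station wind speed."""
    ratio = 0.5 if nearest_station else 0.3
    return ratio * station_wind_speed_ms

print(opening_air_speed(4.0))                         # 4 m/s at nearest station -> 2.0 m/s
print(opening_air_speed(4.0, nearest_station=False))  # distant, open-area station -> ~1.2 m/s
```

A designer would feed such adjusted speeds into whatever opening-flow model is used, rather than the raw station wind speed.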
Article
Structures with a large number of embedded sensors are becoming more common, and this refined spatial information can be used to advantage in damage location and model validation. These sensors could be accelerometers, strain gauges, piezoceramic patches, PVDF film sensors, or optical fibre sensors. This approach requires that the sensors are functioning correctly, which, on a smart structure operating in the field, should be monitored continuously and automatically. This paper considers possible approaches to sensor validation, based on the assumption that a model of the structure is available. The aim is to make use of the natural data redundancy, since there will be more sensors than modes in the data. The validation approaches considered are based on hypothesis testing using a number of techniques, such as modal filtering. The methods are demonstrated on simple examples that exercise their strengths and weaknesses.
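A minimal sketch of the modal-filtering idea behind such sensor validation, under assumed mode shapes and synthetic data (not the paper's exact algorithm): with more sensors than active modes, healthy measurements lie in the span of the mode-shape matrix, so the channel with the largest out-of-subspace residual is the suspect sensor.

```python
import numpy as np

# 6 sensors, 2 assumed mode shapes: measurements y = Phi @ q should lie in
# span(Phi), so the residual after projection exposes an additive fault.
Phi = np.array([[1., 0.], [0., 1.], [1., 1.],
                [1., -1.], [2., 1.], [1., 2.]])
t = np.linspace(0.0, 1.0, 100)
q = np.vstack([np.sin(2 * np.pi * 3 * t),          # modal coordinates over time
               np.cos(2 * np.pi * 5 * t)])
y = Phi @ q                                        # healthy response

y_faulty = y.copy()
y_faulty[3] += 5.0                                 # additive bias fault on sensor 3

P = Phi @ np.linalg.pinv(Phi)                      # projector onto span(Phi)
residual = y_faulty - P @ y_faulty                 # exactly zero for healthy data
rms = np.sqrt((residual ** 2).mean(axis=1))
print(int(np.argmax(rms)))                         # prints 3: sensor 3 is flagged
```

Because the healthy response lies exactly in the modal subspace, the residual isolates the fault contribution; real data would add noise on top of this picture.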
Article
Structures with a large number of sensors and actuators are becoming more common, and their applications vary from active control to damage location. This large amount of spatial information should be used to advantage to continuously monitor the correct functioning of the sensors during normal operation. Errors introduced by faulty sensors can cause a loss of performance and erroneous conclusions, and this paper analyses additive sensor faults. Two residual generation schemes are proposed to monitor sensor faults, namely the modal filtering approach and the so-called Parity Space approach. These residuals are then tested probabilistically using a χ² test to determine whether there is a faulty sensor. These approaches are demonstrated on a simulated cantilevered beam excited at its tip and on an experimental subframe structure.
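The χ² decision step can be sketched as follows: a parity projector annihilates the structured part of the measurement, the residual is whitened by the assumed noise variance, and the resulting statistic is compared against a χ² quantile. The mode shapes, noise level, and fault size below are illustrative assumptions.

```python
import numpy as np

# Parity-space-style residual test: for a redundant 6-sensor network with
# 2 assumed modes, the parity projector has rank 4, so the whitened residual
# energy is chi-squared with 4 degrees of freedom under healthy conditions.
rng = np.random.default_rng(1)
Phi = np.array([[1., 0.], [0., 1.], [1., 1.],
                [1., -1.], [2., 1.], [1., 2.]])
W = np.eye(6) - Phi @ np.linalg.pinv(Phi)      # parity projector (rank 4)
sigma = 0.1                                    # assumed sensor noise std

def chi2_stat(y):
    r = W @ y                                  # parity residual
    return float(r @ r) / sigma ** 2           # ~ chi-squared, dof = 4, if healthy

healthy = sigma * rng.standard_normal(6)       # noise-only measurement
faulty = healthy.copy()
faulty[3] += 2.0                               # additive bias fault on sensor 3

THRESH = 9.488                                 # chi-squared 95% quantile, 4 dof
print(chi2_stat(faulty) > THRESH)              # fault drives the statistic past the limit
```

A healthy sample exceeds the 95% limit only about 5% of the time; a persistent bias of this size exceeds it by a wide margin.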
Book
Exploratory data analysis (EDA) is a strategy of data analysis that emphasizes maintaining an open mind to alternative possibilities. EDA is a philosophy or an attitude about how data analysis should be carried out, rather than being a fixed set of techniques. It is difficult to obtain a clear-cut answer from “messy” human phenomena, and thus the exploratory character of EDA is very suitable to psychological research. This research tradition was founded by John Tukey, who often relates EDA to detective work. In EDA, the role of the researcher is to explore the data in as many ways as possible until a plausible “story” emerges. A detective does not collect just any information. Instead, he or she collects clues related to the central question of the case. By the same token, EDA is not “fishing” or “torturing” the data set until it confesses. Rather, it is a systematic way to investigate relevant information from multiple perspectives. Tukey emphasizes the role of data analysis in research, rather than mathematics, statistics, and probability. Mathematics is secondary in the sense that it is a tool for understanding the data. Classical statistics aims to infer from the sample to the population based on the probability as the relative frequency in the long run. However, in many stages of inquiry, the working questions are non-probabilistic and the focal point should be the data at hand rather than the probabilistic inference in the long run. Hence, prematurely adopting a specific statistical model would hinder the researchers from considering different possible solutions. Because EDA endorses open-mindedness and triangulation, it is not a standalone approach. Rather, it complements traditional confirmatory data analysis (CDA) by generating a working hypothesis, as well as spotting outliers and assumption violations that might invalidate CDA. Additionally, it can also be operated with Bayesian statistics and resampling side by side. 
With the advent of high-power computers and voluminous data, many exploratory techniques have been developed in data science. These methods are known as data mining. Because it is tedious or even impossible to detect the data patterns when the sample size is extremely large or there are too many variables (this problem is called the “curse of dimensionality”), some data miners use machine learning to explore alternate routes for understanding the data. There are different taxonomies of EDA. Traditionally, EDA comprises residual analysis, data re-expression, resistant procedures, and data visualization. With the advance of high-power computing and big data analytics, the alternate taxonomy is goal oriented, namely, clustering, variable screening, and pattern recognition.
Article
Structural health monitoring (SHM) techniques are increasingly used in civil engineering structures, with which authentic environmental and structural response data can be obtained directly. To achieve accurate structural condition assessment and damage detection, it is important to ensure that the monitoring system is robust and the sensors are functioning properly. When a sensor fault occurs, data cannot be correctly acquired at the faulty sensor(s). In such situations, approaches are needed to help reconstruct the missing data. This paper presents an investigation on wind pressure monitoring of a 600 m super-tall structure during a strong typhoon, aiming to compare the performance of data reconstruction using two different neural network (NN) techniques: the back-propagation neural network (BPNN) and the generalized regression neural network (GRNN). The early stopping technique and the Bayesian regularization technique are introduced to enhance the generalization capability of the BPNN. The field monitoring data of wind pressure collected during the typhoon are used to formulate the models. In the verification, wind pressure time series at a faulty sensor location are reconstructed using the monitoring data acquired at the adjacent sensor locations. It is found that the NN models perform satisfactorily in reconstructing the missing data, among which the BPNN model adopting Bayesian regularization (BR-BPNN) performs best. The reconstructed wind pressure dataset has a maximum root mean square error of about 23.4 Pa and a minimum correlation coefficient of about 0.81 in reference to the field monitoring data. It is also shown that the reconstruction capability of the NN models decreases as the faulty sensor location moves from the center to the corner of the sensor array. While the BR-BPNN model performs best in reconstructing the missing data, it takes the longest computational time in model formulation.
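The GRNN used above is, at its core, Nadaraya–Watson kernel regression, which is simple enough to sketch directly. The data, smoothing factor, and channel relationship below are invented for illustration; they stand in for a faulty pressure tap being reconstructed from adjacent taps.

```python
import numpy as np

rng = np.random.default_rng(2)

def grnn_predict(X_train, y_train, X_query, sigma=0.3):
    """GRNN prediction: Gaussian-kernel weighted average of training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# Synthetic "adjacent tap" pressures and a correlated target channel.
X = rng.uniform(-1, 1, size=(400, 2))
y = 0.7 * X[:, 0] - 0.4 * X[:, 1] + 0.02 * rng.standard_normal(400)

X_tr, y_tr = X[:300], y[:300]                   # healthy training period
X_te, y_te = X[300:], y[300:]                   # period with the "faulty" sensor
y_hat = grnn_predict(X_tr, y_tr, X_te)          # reconstructed channel

rmse = float(np.sqrt(((y_hat - y_te) ** 2).mean()))
corr = float(np.corrcoef(y_hat, y_te)[0, 1])
print(rmse < 0.15, corr > 0.9)                  # small error, high correlation
```

Unlike the BPNN, the GRNN has no iterative training; its single smoothing parameter trades bias against variance, which is one reason it formulates quickly but may generalize less sharply than a regularized BPNN.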
Article
In dynamic measurements, the sensors may have both spatial and temporal correlation, which can be utilized to detect, isolate, and correct the faulty sensor. The method is based on missing data analysis using the time history data. Using the temporal correlation is justified if the number of active structural modes is larger than the number of sensors. The disadvantages of the temporal model are that (1) it involves defining an additional parameter, the model order; (2) the computational effort increases; and (3) the process must be stationary. Experimental multichannel acceleration measurements were used to verify the proposed method. The method is compared with one using the spatial correlation only.
Article
Three data-based techniques for sensor validation are studied: the minimum mean square error (MMSE) estimation, the principal component analysis (PCA), and the factor analysis (FA). In all methods a single sensor is estimated using the remaining ones. It is shown that MMSE outperforms the other two, while PCA has the lowest performance. MMSE has no input parameters to be defined, whereas both PCA and FA include one parameter, the number of principal components or factors. PCA is observed to be more sensitive to this parameter than FA. FA is computationally the least efficient due to an iterative algorithm. The methods are compared using experimental vibration measurements of a bridge. Different sensor faults are studied and the performance is assessed from the capability to detect a sensor fault, and to identify and reconstruct a faulty sensor.
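The MMSE scheme that performs best above can be sketched as a linear estimator learned from healthy training data: one channel is regressed on the centred remaining channels, and the fitted combination reconstructs it thereafter. The latent-factor data and channel count below are invented for illustration.

```python
import numpy as np

# Synthetic correlated sensor network: 5 channels driven by 2 latent "modal"
# responses plus measurement noise, so any one channel is largely predictable
# from the other four.
rng = np.random.default_rng(3)
modes = rng.standard_normal((500, 2))
mix = rng.standard_normal((2, 5))
Y = modes @ mix + 0.05 * rng.standard_normal((500, 5))

train, test = Y[:400], Y[400:]
i = 2                                             # channel to reconstruct
rest = [j for j in range(5) if j != i]

# Linear MMSE via least squares on centred training data.
mu = train.mean(axis=0)
A, *_ = np.linalg.lstsq(train[:, rest] - mu[rest],
                        train[:, i] - mu[i], rcond=None)
y_hat = mu[i] + (test[:, rest] - mu[rest]) @ A

err = float(np.sqrt(((y_hat - test[:, i]) ** 2).mean()))
print(err < 0.2)                                  # reconstruction error near the noise floor
```

Note the point made in the abstract: nothing here needs a user-chosen parameter, whereas a PCA or FA version of the same estimator would require choosing the number of components or factors.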
Article
Discrimination between three different sources of variability in a vibration-based structural health monitoring system is investigated: environmental or operational effects, sensor faults, and structural damage. Separating the environmental or operational effects from the other two is based on the assumption that measurements under different environmental or operational conditions are included in the training data. Distinguishing between sensor fault and structural damage utilizes the fact that the sensor faults are local, while structural damage is global. By localizing the change to a sensor which is then removed from the network, the two different influences can be separated. The sensor network is modelled as a Gaussian process and the generalized likelihood ratio test (GLRT) is then used to detect and localize a change in the system. A numerical and an experimental study are performed to validate the proposed method.
Article
In modern manufacturing processes, massive amounts of multivariate data are routinely collected through automated in-process sensing. These data often exhibit high correlation, rank deficiency, low signal-to-noise ratio and missing values. Conventional univariate and multivariate statistical process control techniques are not suitable to be used in these environments. This article discusses these issues and advocates the use of multivariate statistical process control based on principal component analysis (MSPC-PCA) as an efficient statistical tool for process understanding, monitoring and diagnosing assignable causes for special events in these contexts. Data from an autobody assembly process are used to illustrate the practical benefits of using MSPC-PCA rather than conventional SPC in manufacturing processes.
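A compact sketch of the MSPC-PCA monitoring loop described above: fit PCA on in-control data, then score new samples with Hotelling's T² (variation inside the model) and SPE/Q (variation outside it). The data are synthetic, and the control limits are crude empirical percentiles rather than the usual theoretical limits.

```python
import numpy as np

# In-control data: 8 correlated variables driven by 2 latent sources.
rng = np.random.default_rng(4)
latent = rng.standard_normal((300, 2))
load = rng.standard_normal((2, 8))
X = latent @ load + 0.1 * rng.standard_normal((300, 8))

mu = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
P = Vt[:2].T                                   # retain 2 principal components
lam = (S[:2] ** 2) / (len(X) - 1)              # variance along each PC

def t2_spe(x):
    t = (x - mu) @ P                           # scores in the PCA model
    resid = (x - mu) - t @ P.T                 # part the model cannot explain
    return float((t ** 2 / lam).sum()), float(resid @ resid)

# Empirical 99th-percentile limits from the in-control data.
stats = np.array([t2_spe(x) for x in X])
t2_lim, spe_lim = np.percentile(stats, 99, axis=0)

fault = X[0] + np.array([0., 0., 3., 0., 0., 0., 0., 0.])  # shift on variable 2
t2_f, spe_f = t2_spe(fault)
print(spe_f > spe_lim)                         # special event flagged by SPE
```

A shift that breaks the correlation structure, as here, shows up in SPE even when each individual variable stays within its univariate limits, which is the practical advantage the article argues for.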
Article
Sensor faults can be detected and corrected in a multichannel measurement system with sufficient redundancy using the measurement data alone. One or several sensors can be estimated from the remaining sensors if training data from the functioning sensor network are available. The method is based on minimum mean square error (MMSE) estimation, which is applied to the time history data, e.g. accelerations. The faulty sensor can be identified and replaced with the estimated one. Both spatial and temporal correlation of the sensors can be utilized. Using the temporal correlation is justified if the number of active structural modes is larger than the number of sensors. The disadvantages of the temporal model are discussed. Experimental multichannel vibration measurements are used to verify the proposed method. Different, and also simultaneous, sensor faults are studied. The effects of environmental variability and structural damage are discussed.
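The temporal side of the method can be sketched by augmenting the regressors with lagged values of the healthy channels, so the estimator exploits correlation in time as well as across sensors. The signals, model order, and channel relationships below are illustrative assumptions.

```python
import numpy as np

# Three synthetic channels: channel 0 is the one to reconstruct, channel 1
# is a delayed copy, channel 2 is a scaled noisy copy.
rng = np.random.default_rng(5)
n = 600
s = np.sin(0.2 * np.arange(n)) + 0.05 * rng.standard_normal(n)
Y = np.column_stack([s, np.roll(s, 1),
                     0.5 * s + 0.05 * rng.standard_normal(n)])

def lagged_design(Y, rest, order):
    """Stack current and lagged values of the healthy channels."""
    cols = [np.roll(Y[:, j], k) for j in rest for k in range(order + 1)]
    return np.column_stack(cols)[order:]       # drop rows where roll wrapped

order, i, rest = 2, 0, [1, 2]                  # model order is a user choice
X = lagged_design(Y, rest, order)
y = Y[order:, i]

coef, *_ = np.linalg.lstsq(X[:400], y[:400], rcond=None)
y_hat = X[400:] @ coef
rmse = float(np.sqrt(((y_hat - y[400:]) ** 2).mean()))
print(rmse < 0.2)                              # lagged model tracks channel 0
```

The extra `order` parameter is exactly the disadvantage the abstract notes: it must be chosen, it grows the design matrix, and the whole construction assumes a stationary process.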
Article
In this paper, three latent variable methods are implemented in a multivariate statistical analysis scheme for detecting and identifying faults in a multi-sensor network. The proposed methods are applied to a sensor network monitoring a MDOF dynamical system and the results are presented, compared and discussed.
Article
Analytical solutions are derived for calculating natural ventilation flow rates and air temperatures in a single-zone building with two openings when no thermal mass is present. In these solutions, the independent variables are the heat source strength and wind speed, rather than given indoor air temperatures. Three air change rate parameters α, β and γ are introduced to characterise, respectively, the effects of the thermal buoyancy force, the envelope heat loss and the wind force. Non-dimensional graphs are presented for calculating ventilation flow rates and air temperatures, and for sizing ventilation openings. The wind can either assist the buoyancy force or oppose the airflow. For assisting winds, the flow is always upwards and the solutions are straightforward. For opposing winds, the flow can be either upwards or downwards depending on the relative strengths of the two forces. In this case, the solution for the flow rate as a function of the heat source strength presents some complex features. A simple dynamical analysis is carried out to identify the stable solutions.
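For the simple assisting-wind case, where the abstract notes the solution is straightforward, a textbook orifice-flow estimate can be sketched: the stack and wind pressure differences add, and the flow follows a square-root law. This is a hedged stand-in for the paper's full solution (it does not reproduce the α, β, γ parameters or the envelope heat loss); all numbers are illustrative assumptions.

```python
import math

def ventilation_flow(dT, U, H=3.0, A=0.5, Cd=0.6, dCp=0.5, T_out=293.0):
    """Volumetric flow (m^3/s) through two openings with an assisting wind.

    dT: indoor-outdoor temperature difference (K); U: reference wind speed (m/s);
    H: opening height separation; A: effective opening area; Cd: discharge
    coefficient; dCp: pressure-coefficient difference across the openings.
    """
    rho, g = 1.2, 9.81
    dp_stack = rho * g * H * dT / T_out        # thermal buoyancy pressure
    dp_wind = 0.5 * rho * dCp * U ** 2         # wind pressure difference
    return Cd * A * math.sqrt(2 * (dp_stack + dp_wind) / rho)

# Buoyancy only, then buoyancy plus a 3 m/s assisting wind:
print(round(ventilation_flow(5.0, 0.0), 3), round(ventilation_flow(5.0, 3.0), 3))
```

For opposing winds the two pressures would subtract, and, as the abstract discusses, the flow direction and stability of the solution then depend on which force dominates; that regime is not captured by this simple additive sketch.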
Article
Even though there has been a recent interest in the use of principal component analysis (PCA) for sensor fault detection and identification, few identification schemes for faulty sensors have considered the possibility of an abnormal operating condition of the plant. This article presents the use of PCA for sensor fault identification via reconstruction. The principal component model captures measurement correlations and reconstructs each variable by using iterative substitution and optimization. The transient behavior of a number of sensor faults in various types of residuals is analyzed. A sensor validity index (SVI) is proposed to determine the status of each sensor. On-line implementation of the SVI is examined for different types of sensor faults. The way the index is filtered represents an important tuning parameter for sensor fault identification. An example using boiler process data demonstrates attractive features of the SVI.
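The reconstruction-by-iterative-substitution step can be sketched as follows: the suspect variable is repeatedly replaced by its PCA model estimate until the value converges. The fault-size check at the end is a simplified stand-in for the paper's sensor validity index, and the data are synthetic.

```python
import numpy as np

# In-control data: 6 correlated variables driven by 2 latent sources.
rng = np.random.default_rng(6)
latent = rng.standard_normal((400, 2))
X = latent @ rng.standard_normal((2, 6)) + 0.05 * rng.standard_normal((400, 6))

mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
C = Vt[:2].T @ Vt[:2]                          # projection onto the 2-PC model

def reconstruct(x, j, iters=50):
    """Iteratively substitute sensor j with its PCA model estimate."""
    z = x - mu
    for _ in range(iters):
        z[j] = (C @ z)[j]                      # converges since C[j, j] < 1
    return z + mu

x = X[0].copy()
x[4] += 4.0                                    # bias fault on sensor 4
x_rec = reconstruct(x.copy(), 4)
print(abs(x_rec[4] - X[0, 4]) < 0.5)           # fault largely removed
```

Comparing the residual of each variable before and after its own reconstruction is then the basis for an index that identifies which sensor is faulty.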
Article
Experiments were conducted to study the effect of mechanically induced fresh-air ventilation on the indoor air quality (IAQ) of the Tuskegee Healthy House (THH), with outdoor weather conditions selected to be almost identical during the “fan OFF” and “fan ON” periods. Measurements of outdoor and indoor temperature and relative humidity (RH), in addition to the indoor dust particle concentration levels and interior wall moisture content, were systematically carried out during the summer month of August 2008. Results show that the effect of mechanically induced ventilation (the “fan ON” period) is to raise the indoor RH, interior wall moisture content, and indoor dust particle concentration values significantly above those measured during the “fan OFF” period. The indoor temperature increases only slightly during the “fan ON” period.
Article
This paper presents a chronological overview of the developments in interpolation theory, from the earliest times to the present date. It brings out the connections between the results obtained in different ages, thereby putting the techniques currently used in signal and image processing into historical perspective. A summary of the insights and recommendations that follow from relatively recent theoretical as well as experimental studies concludes the presentation.
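A minimal example of the signal-processing task this survey covers: filling gaps in a uniformly sampled signal by piecewise-linear interpolation. The signal and gap locations below are invented for illustration; higher-order kernels from the survey's lineage would reduce the error further.

```python
import numpy as np

# A smooth test signal with three samples "lost" to a faulty sensor.
t = np.arange(20, dtype=float)
signal = np.sin(0.3 * t)
missing = np.zeros(20, dtype=bool)
missing[[5, 6, 12]] = True

# Linear interpolation from the surviving samples (np.interp requires the
# sample points to be increasing, which t[~missing] is).
filled = signal.copy()
filled[missing] = np.interp(t[missing], t[~missing], signal[~missing])

err = float(np.abs(filled[missing] - signal[missing]).max())
print(round(err, 3))                           # small error for this smooth signal
```

The two-sample gap at indices 5–6 dominates the error, illustrating the general point that interpolation accuracy degrades with gap length relative to the signal's variation.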
How to Handle Missing Data
  • A. Swalin