George Besseris’s research while affiliated with Kingston University and other places


Publications (13)


Non-Linear Saturated Multi-Objective Pseudo-Screening Using Support Vector Machine Learning, Pareto Front, and Belief Functions: Improving Wastewater Recycling Quality
  • Article
  • Full-text available

October 2024 · 14 Reads

George Besseris

Increasing wastewater treatment efficiency is a primary aim in the circular economy. Wastewater physicochemical and biochemical processes are quite complex, often requiring a combination of statistical and machine learning tools to empirically model them. Since wastewater treatment plants are large-scale operations, the limited opportunities for extensive experimentation may be offset by miniaturizing experimental schemes through the use of fractional factorial designs (FFDs). A recycling quality improvement study that relies on non-linear multi-objective multi-parameter FFD (NMMFFD) datasets was reanalyzed. A published NMMFFD ultrafiltration screening/optimization case study was re-examined regarding how four controlling factors affected three paper mill recycling characteristic responses using a combination of statistical and machine learning methods. Comparative machine learning screening predictions were provided by (1) quadratic support vector regression and (2) optimizable support vector regression, in contrast to quadratic linear regression. NMMFFD optimization was performed by employing Pareto fronts. Pseudo-screening was applied by decomposing the replicated NMMFFD dataset to single replicates and then testing their replicate repeatability by introducing belief functions that sought to maximize credibility and plausibility estimates. Various versions of belief functions were considered, since the novel role of the three process characteristics, as independent sources, created a high level of conflict during the information fusion phase, due to the inherent divergent belief structures. Correlations between two characteristics, but with opposite goals, may also have contributed to the source conflict. The active effects for the NMMFFD dataset were found to be the transmembrane pressure and the molecular weight cut-off. 
The modified adjustment was pinpointed to the molecular weight cut-off at 50 kDa, while the optimal transmembrane pressure setting persisted at 2.0 bar. This mixed-methods approach may provide additional confidence in determining improved recycling process adjustments. It would be interesting to implement this approach in polyfactorial wastewater screenings with a greater number of process characteristics.
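The Pareto-front optimization step described above can be illustrated with a brute-force non-dominated filter. This is a minimal sketch on made-up bi-objective values (not the study's ultrafiltration data), assuming every objective has been expressed so that smaller is better:

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors.

    All objectives are minimized; a point q dominates p when q is no
    worse in every objective and strictly better in at least one.
    """
    front = []
    for p in points:
        dominated = any(
            all(o <= v for o, v in zip(q, p)) and any(o < v for o, v in zip(q, p))
            for q in points if q is not p
        )
        if not dominated:
            front.append(p)
    return front

# Illustrative (hypothetical) bi-objective trial outcomes, e.g.
# (permeate-flux deficit, rejection deficit) per FFD run:
runs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
print(pareto_front(runs))  # (3.0, 4.0) is dominated by (2.0, 3.0)
```

The quadratic scan is fine here because fractional factorial designs keep the run count small by construction; larger candidate sets would call for a sort-based variant.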


Lean-and-Green Fractional Factorial Screening of 3D-Printed ABS Mechanical Properties Using a Gibbs Sampler and a Neutrosophic Profiler

July 2024 · 31 Reads · 2 Citations

The use of acrylonitrile butadiene styrene (ABS) in additive manufacturing applications constitutes an elucidating example of a promising match of a sustainable material to a sustainable production process. Lean-and-green datacentric-based techniques may enhance the sustainability of product-making and process-improvement efforts. The mechanical properties—the yield strength and the ultimate compression strength—of 3D-printed ABS product specimens are profiled by considering as many as eleven controlling factors at the process/product design stage. A fractional-factorial trial planner is used to sustainably suppress by three orders of magnitude the experimental needs for materials, machine time, and work hours. A Gibbs sampler and a neutrosophic profiler are employed to treat the complex production process by taking into account potential data uncertainty complications due to multiple distributions and indeterminacy issues due to inconsistencies owing to mechanical testing conditions. The small-data multifactorial screening outcomes appeared to steadily converge to three factors (the layer height, the infill pattern angle, and the outline overlap) with a couple of extra factors (the number of top/bottom layers and the infill density) to supplement the linear modeling effort and provide adequate predictions for maximizing the responses of the two examined mechanical properties. The performance of the optimal 3D-printed ABS specimens exhibited sustainably acceptable discrepancies, which were estimated at 3.5% for the confirmed mean yield strength of 51.70 MPa and at 5.5% for the confirmed mean ultimate compression strength of 53.58 MPa. The verified predictors that were optimally determined from this study were (1) the layer thickness—set at 0.1 mm; (2) the infill angle—set at 0°; (3) the outline overlap—set at 80%; (4) the number of top/bottom layers—set at 5; and (5) the infill density—set at 100%. 
The multifactorial datacentric approach composed of a fractional-factorial trial planner, a Gibbs sampler, and a neutrosophic profiler may be further tested on more intricate materials and composites while introducing additional product/process characteristics.
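The Gibbs sampler named above is, generically, an alternating draw from full conditional distributions. Below is a textbook sketch for a standard bivariate normal target, purely illustrative: the paper's sampler targets its own model posterior, and the `rho`, seed, and sample count here are arbitrary choices.

```python
import random

def gibbs_bivariate_normal(rho, n_samples, seed=1):
    """Gibbs sampling for a standard bivariate normal with correlation rho.

    Each full conditional is x | y ~ N(rho * y, 1 - rho**2), so the chain
    alternates a draw of x given y with a draw of y given x.
    """
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_samples):
        x = rng.gauss(rho * y, sd)  # draw x | y
        y = rng.gauss(rho * x, sd)  # draw y | x
        samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(rho=0.5, n_samples=5000)
mean_x = sum(x for x, _ in draws) / len(draws)
mean_y = sum(y for _, y in draws) / len(draws)
```

With enough iterations the empirical means settle near the target's zero mean; in a real screening application the conditionals come from the regression model rather than a known closed-form normal.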


Lean-and-Green Datacentric Engineering in Laser Cutting: Non-Linear Orthogonal Multivariate Screening Using Gibbs Sampling and Pareto Frontier

February 2024 · 25 Reads · 1 Citation

Metal processing may benefit from innovative lean-and-green datacentric engineering techniques. Broad process improvement opportunities in the efficient usage of materials and energy are anticipated (United Nations Sustainable Development Goals #9 and #12). A CO2 laser cutting method is investigated in this study in terms of product characteristics (surface roughness (SR)) and process characteristics (energy consumption (EC), gas consumption (GC), and cutting time (CT)). The examined laser cutter controlling factors were as follows: (1) the laser power (LP), (2) the cutting speed (CS), (3) the gas pressure (GP), and (4) the laser focus length (F). The selected 10 mm-thick carbon steel (EN10025 St37-2) workpiece was arranged to have various geometric configurations so as to simulate a variety of real industrial milling demands. Non-linear saturated screening/optimization trials were planned using the Taguchi-type L9(3⁴) orthogonal array. The resulting multivariate dataset was treated using a combination of the Gibbs sampler and the Pareto frontier method in order to approximate the strength of the studied effects and to find a solution that minimizes all the tested process/product characteristics. The Pareto frontier optimal solution was (EC, GC, CT, SR) = (4.67 kWh, 20.35 Nm³, 21 s, 5.992 μm) for the synchronous screening/optimization of the four characteristics. The respective factorial settings were optimally adjusted at the four inputs (LP, CS, GP, F) located at (4 kW, 1.9 mm/min, 0.75 bar, +2.25 mm). The Gibbs-sampler-aided linear regression analysis identified the laser power and the cutting speed as the stronger effects on energy consumption. Similarly, the cutting speed and the gas pressure were identified as strong effects on gas consumption, along with a reciprocal effect of the cutting speed on the cutting time.
Further industrial explorations may involve more intricate workpiece geometries, burr formation phenomena, and process economics.
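The Taguchi-type L9(3⁴) orthogonal array used to plan trials like these can be generated from two base three-level columns, with the remaining two columns formed as modular sums. A compact sketch (the column ordering below is one common convention, not necessarily the one used in the paper):

```python
from itertools import product

def taguchi_l9():
    """Construct the L9(3^4) orthogonal array.

    From base columns a and b, the four columns are
    a, b, (a + b) mod 3, and (a + 2b) mod 3; every pair of columns
    then contains each of the 9 level combinations exactly once.
    """
    return [
        (a, b, (a + b) % 3, (a + 2 * b) % 3)
        for a, b in product(range(3), repeat=2)
    ]

oa = taguchi_l9()
print(oa[0], oa[4])  # (0, 0, 0, 0) (1, 1, 2, 0)
```

Nine runs thus cover four three-level factors in a balanced way, which is what allows a saturated screening of main effects and curvature with so few trials.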


Datacentric Similarity Matching of Emergent Stigmergic Clustering to Fractional Factorial Vectoring: A Case for Leaner-and-Greener Wastewater Recycling

October 2023 · 25 Reads · 1 Citation

Water scarcity is a challenging global risk. Urban wastewater treatment technologies, which utilize processes based on single-stage ultrafiltration (UF) or nanofiltration (NF), have the potential to offer lean-and-green cost-effective solutions. Robustifying the effectiveness of water treatment is a complex, multidimensional problem. In this study, a non-linear Taguchi-type orthogonal-array (OA) sampler is enriched with an emergent stigmergic clustering procedure to conduct the screening/optimization of multiple UF/NF aquametric performance metrics. The stochastic solver employs the Databionic swarm intelligence routine to classify the resulting multi-response dataset. Next, a cluster separation measure, the Davies–Bouldin index, is used to evaluate input and output relationships. The appropriateness of the self-organized bionic-classifier data partition is assessed by matching signatures between the emergent stigmergic clustering memberships and the OA factorial vector sequences. To illustrate the proposed methodology, recently published multi-response multifactorial L9(3⁴) OA-planned experiments from two interesting UF-/NF-membrane processes are examined. In the study, seven UF-membrane process characteristics and six NF-membrane process characteristics are tested (1) in relationship to four controlling factors and (2) to synchronously evaluate individual factorial curvatures. The results are compared with other ordinary clustering methods and their performances are discussed. The unsupervised robust bionic prediction reveals that the permeate flux influences both the UF- and NF-membrane process performances. For the UF process and a three-cluster model, the Davies–Bouldin index was minimized at values of 1.89 and 1.27 for the centroid and medoid centrotypes, respectively. For the NF process and a two-cluster model, the Davies–Bouldin index was minimized for both centrotypes at values close to 0.4, which was fairly close to the self-validation value.
The advantage of this proposed data-centric engineering scheme relies on its emergent and self-organized clustering capability, which retraces its appropriateness to the fractional factorial rigid structure and, hence, it may become useful for screening and optimizing small-data wastewater operating conditions.


Figure 4. Normal P-P plot of the regression standardized residuals for the dependent variable EC (IBM SPSS v29).
Figure 5. Gap statistic performance for profiling the optimal clustering size for the summarized supersaturated dataset of Table 17.
Figure 6. Cluster quality rating using the silhouette measure of cohesion and separation (IBM SPSS v29).
Figure 8. Individually contrasting the clustered supersaturated datasets for their four summarizing estimators: (A) median (M), (B) interquartile range (I), (C) skewness (S), and (D) kurtosis (K).
Apartment unit shell structural surfaces.
Using Lean-and-Green Supersaturated Poly-Factorial Mini Datasets to Profile Energy Consumption Performance for an Apartment Unit

June 2023 · 72 Reads · 2 Citations

The Renovation Wave for Europe initiative aspires to materialize the progressive greening of 85–95% of the continental older building stock as part of the European Green Deal objectives to reduce emissions and energy use. To realistically predict the energy performance even for a single apartment building is a difficult problem. This is because an apartment unit is inherently a customized construction which is subject to year-round occupant use. We use a standardized energy consumption response approach to accelerate the setting-up of the problem in pertinent energy engineering terms. Nationally instituted Energy Performance Certification databases provide validated energy consumption information by taking into account an apartment unit’s specific shell characteristics along with its installed electromechanical system configuration. Such a pre-engineered framework facilitates the effect evaluation of any proposed modifications on the energy performance of a building. Treating a vast building stock requires a mass-customization approach. Therefore, a lean-and-green, industrial-level problem-solving strategy is pursued. The TEE-KENAK Energy Certification database platform is used to parametrize a real standalone apartment. A supersaturated mini dataset was planned and collected to screen as many as 24 controlling factors, which included apartment shell layout details in association with the electromechanical systems arrangements. Main effects plots, best-subsets partial least squares, and entropic (Shannon) mutual information predictions—supplemented with optimal shrinkage estimations—formed the recommended profiler toolset. Four leading modifications were found to be statistically significant: (1) the thermal insulation of the roof, (2) the gas-sourced heating systems, (3) the automatic control category type ‘A’, and (4) the thermal insulation of the walls. 
The optimal profiling delivered an energy consumption projection of 110.4 kWh/m2 (energy status ‘B’) for the apartment—an almost 20% reduction in energy consumption while also achieving upgrading from the original ‘C’ energy status. The proposed approach may aid energy engineers to make general empirical screening predictions in an expedient manner by simultaneously considering the apartment unit’s structural configuration as well as its installed electromechanical systems arrangement.
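The entropic (Shannon) mutual-information screen mentioned above ranks a factor by how much knowing its level reduces uncertainty about the discretized response. A minimal plug-in estimator on hypothetical two-level columns (the factor values and response below are invented for illustration):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Empirical Shannon mutual information (in bits) between two
    discrete sequences; a higher value flags the factor as more
    informative about the (discretized) response.
    """
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
        mi += p * log2(p * n * n / (px[x] * py[y]))
    return mi

# Hypothetical two-level factor columns against a binarized response:
factor_a = [0, 0, 1, 1]   # perfectly tracks the response
factor_b = [0, 1, 0, 1]   # independent of it in this sample
response = [0, 0, 1, 1]
print(mutual_information(factor_a, response))  # 1.0 bit
print(mutual_information(factor_b, response))  # 0.0 bits
```

Plug-in estimates like this are biased upward on very small samples, which is one reason the study pairs the entropic predictions with shrinkage estimations.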


Lean-and-Green Strength Performance Optimization of a Tube-to-Tubesheet Joint for a Shell-and-Tube Heat Exchanger Using Taguchi Methods and Random Forests

April 2023 · 149 Reads

The failing tube-to-tubesheet joint is identified as a primary quality defect in the fabrication of a shell-and-tube heat exchanger. Operating in conditions of high pressure and temperature, a shell-and-tube heat exchanger may be susceptible to leakage around faulty joints. Owing to the ongoing low performance of the adjacent tube-to-tubesheet expansion, the heat exchanger eventually experiences malfunction. A quality improvement study on the assembly process is necessary in order to delve into the tight-fitting of the tube-to-tubesheet joint. We present a non-linear screening and optimization study of the tight-fitting process of P215NL (EN 10216-4) tube samples on P265GH (EN 10028-2) tubesheet specimens. A saturated fractional factorial scheme was implemented to screen and optimize the tube-to-tubesheet expanded-joint performance by examining the four controlling factors: (1) the clearance, (2) the number of grooves, (3) the groove depth, and (4) the tube wall thickness reduction. The adopted ‘green’ experimental tactic required duplicated tube-push-out test trials to form the ‘lean’ joint strength response dataset. Analysis of variance (ANOVA) and regression analysis were subsequently employed in implementing the Taguchi approach to accomplish the multifactorial non-linear screening classification and the optimal setting adjustment of the four investigated controlling factors. It was found that the tube-wall thickness reduction had the highest influence on joint strength (55.17%) and was followed in the screening hierarchy by the number of grooves (at 30.47%). The groove depth (at 7.20%) and the clearance (at 6.84%) were rather weaker contributors, in spite of being evaluated to be statistically significant. A confirmation run showed that the optimal joint strength prediction was adequately estimated. 
Besides exploring the factorial hierarchy with statistical methods, an algorithmic (Random Forest) approach agreed with the leading effects line-up (the tube wall thickness and the number of grooves) and offered an improved overall prediction for the confirmation-run test dataset.
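The percentage contributions reported above (e.g., 55.17% for the tube wall thickness reduction) follow the Taguchi-style ANOVA convention: a factor's between-level sum of squares taken as a share of the total sum of squares. A one-factor sketch on invented push-out strength values:

```python
def percent_contribution(levels, response):
    """Percentage contribution of one factor in Taguchi-style ANOVA:
    the between-level sum of squares divided by the total sum of
    squares, times 100.
    """
    grand = sum(response) / len(response)
    ss_total = sum((y - grand) ** 2 for y in response)
    groups = {}
    for lv, y in zip(levels, response):
        groups.setdefault(lv, []).append(y)
    ss_factor = sum(
        len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups.values()
    )
    return 100.0 * ss_factor / ss_total

# Hypothetical push-out strengths split by a two-level factor:
levels = [0, 0, 1, 1]
strength = [10.0, 12.0, 20.0, 22.0]
print(percent_contribution(levels, strength))  # ~96.15% of the variation
```

Repeating this per factor and comparing the shares yields exactly the kind of screening hierarchy quoted in the abstract.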


Table captions: testing the eight time series for normality (three statistical tests); testing the eight time series for stationarity (two statistical tests); autoregressive and moving average coefficient statistics (fitted ARIMA model) for the eight temperature data sequences.
Testing Thermostatic Bath End-Scale Stability for Calibration Performance with a Multiple-Sensor Ensemble Using ARIMA, Temporal Stochastics and a Quantum Walker Algorithm

February 2023 · 51 Reads

Thermostatic bath calibration performance is usually checked for uniformity and stability to serve a wide range of industrial applications. Particularly challenging is the assessment at the limiting specification ends where the sensor system may be less effective in achieving consistency. An ensemble of eight sensors is used to test temperature measurement stability at various topological locations in a thermostatic bath (antifreeze) fluid at −20 °C. Eight streaks of temperature data were collected, and the resulting time-series were processed for normality, stationarity, and independence and identical distribution by employing regular statistical inference methods. Moreover, they were evaluated for autoregressive patterns and other underlying trends using classical Auto-Regressive Integrated Moving Average (ARIMA) modeling. In contrast, a continuous-time quantum walker algorithm was implemented, using an available R-package, in order to test the behavior of the fitted coefficients on the probabilistic node transitions of the temperature time series dataset. Tracking the network sequence for persistence and hierarchical mode strength was the objective. The quantum walker approach favoring a network probabilistic framework was posited as a faster way to arrive at simultaneous instability quantifications for all the examined time-series. The quantum walker algorithm may furnish expedient modal information in comparison to the classical ARIMA modeling and in conjunction with several popular stochastic analyzers of time-series stationarity, normality, and data sequence independence of temperature end-of-scale calibration datasets, which are investigated for temporal consistency.
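The autoregressive fitting step can be reduced, for intuition, to estimating the lag-1 coefficient of a zero-mean AR(1) model by least squares. This is a sketch of the simplest special case only, not the full ARIMA machinery the study applies:

```python
def ar1_coefficient(series):
    """Least-squares estimate of phi in the zero-mean AR(1) model
    x[t] = phi * x[t-1] + e[t]: regress each value on its predecessor.
    """
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

# A noise-free AR(1) trace with phi = 0.5 recovers the coefficient:
trace = [1.0]
for _ in range(9):
    trace.append(0.5 * trace[-1])
print(ar1_coefficient(trace))  # 0.5
```

A coefficient near 1 would indicate a near-unit-root (non-stationary) temperature sequence, which is the kind of instability the stationarity tests above are probing for.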


Lean Screening for Greener Energy Consumption in Retrofitting a Residential Apartment Unit

June 2022 · 66 Reads · 5 Citations

Buildings consume a large portion of the global primary energy. They are also key contributors to CO2 emissions. Greener residential buildings are part of the ‘Renovation Wave’ in the European Green Deal. The purpose of this study was to explore the usefulness of energy consumption screening as a part of seeking retrofitting opportunities in the older residential building stock. The objective was to manage the screening of the electromechanical energy systems for an existing apartment unit. The parametrization drew upon inspection items in a comprehensive electronic checklist—part of an official software package—used to derive the energy certification status of a residential building. The extensive empirical parametrization was intended to discover retrofitting options while offering a glimpse of the influence of the intervention costs on the final screening outcome. A supersaturated trial planner was implemented to drastically reduce the time and volume of the experiments. Matrix data analysis chart-based sectioning and general linear model regression seamlessly integrate into a simple lean-and-agile solver engine that coordinates the polyfactorial profiling of the joint multiple characteristics. The showcased study employed a 14-run 24-factor supersaturated scheme to organize the data collection of the performance of the energy consumption along with the intervention costs. It was found that the effects that influence the energy consumption may be slightly differentiated if intervention costs are also simultaneously considered. The four strong factors that influenced the energy consumption were the automation type for hot water, the types of heating and cooling systems, and the power of the cooling systems. An energy certification category rating of ‘B’ was achieved; thus, the original status (‘C’) was upgraded. The renovation profiling practically reduced the energy consumption by 47%.
The concurrent screening of energy consumption and intervention costs detected five influential effects—the automation type for water heating, the automation control category, the heating systems type, the location of the heating system distribution network, and the efficiency of the water heating distribution network. The overall approach was shown to be simpler and even more accurate than other potentially competitive methods. The originality of this work lies in its rareness, worldwide criticality, and impact since it directly deals with the energy modernization of older residential units while promoting greener energy performance.
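For supersaturated schemes like the 14-run, 24-factor plan above (more factors than runs), a common first-pass screen ranks factor columns by absolute correlation with the response before any regression modeling. A minimal sketch with two invented two-level columns (the factor names here are hypothetical):

```python
def correlation_screen(factors, response):
    """Rank candidate factors of a supersaturated design by the absolute
    Pearson correlation of each coded column with the response.
    """
    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    scored = {name: abs(pearson(col, response)) for name, col in factors.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Hypothetical coded columns; 'heating' drives the response here:
factors = {
    "heating": [-1, -1, 1, 1, -1, 1],
    "cooling": [-1, 1, -1, 1, 1, -1],
}
response = [100, 102, 140, 138, 99, 141]
print(correlation_screen(factors, response))  # 'heating' ranked first
```

Because supersaturated columns are partially aliased with one another, such a ranking is only a shortlist; the study follows it with model-based profiling rather than taking the correlations at face value.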


Databionic Swarm Intelligence to Screen Wastewater Recycling Quality with Factorial and Hyper-Parameter Non-Linear Orthogonal Mini-Datasets

June 2022 · 33 Reads

Electrodialysis (ED) may be designed to enhance wastewater recycling efficiency for crop irrigation in areas where water distribution is otherwise inaccessible. ED process controls are difficult to manage because the ED cells need to be custom-built to meet local requirements, and the wastewater influx often has heterogeneous ionic properties. Besides the underlying complex chemical phenomena, recycling screening is a challenge to engineering because the number of experimental trials must be maintained low in order to be timely and cost-effective. A new data-centric approach is presented that screens three water quality indices against four ED-process-controlling factors for a wastewater recycling application in agricultural development. The implemented unsupervised solver must: (1) be fine-tuned for optimal deployment and (2) screen the ED trials for effect potency. The databionic swarm intelligence classifier is employed to cluster the L9(3⁴) OA mini-dataset of: (1) the removed Na+ content, (2) the sodium adsorption ratio (SAR) and (3) the soluble Na+ percentage. From an information viewpoint, the proviso for the factor profiler is that it should be apt to detect strength and curvature effects against not-computable uncertainty. The strength hierarchy was analyzed for the four ED-process-controlling factors: (1) the dilute flow, (2) the cathode flow, (3) the anode flow and (4) the voltage rate. The new approach matches two sequences for similarities, according to: (1) the classified cluster identification string and (2) the pre-defined OA factorial setting string. Internal cluster validity is checked by the Dunn and Davies–Bouldin indices, after completing a hyper-parameter L8(4¹2²) OA screening. The three selected hyper-parameters (distance measure, structure type and position type) created negligible variability. The dilute flow was found to regulate the overall ED-based separation performance.
The results agree with other recent statistical/algorithmic studies through external validation. In conclusion, statistical/algorithmic freeware (R-packages) may be effective in resolving quality multi-indexed screening tasks of intricate non-linear mini-OA-datasets.
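The Dunn index used in the internal validity check above is the smallest between-cluster distance divided by the largest within-cluster diameter, so larger values indicate tighter, better-separated clusters. A single-linkage sketch on toy points (not the ED dataset):

```python
def dunn_index(clusters):
    """Dunn index for a list of point clusters (higher is better):
    minimum single-linkage inter-cluster distance over the maximum
    intra-cluster diameter.
    """
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    inter = min(
        dist(p, q)
        for i, ci in enumerate(clusters)
        for cj in clusters[i + 1:]
        for p in ci for q in cj
    )
    diameter = max(dist(p, q) for c in clusters for p in c for q in c)
    return inter / diameter

toy_clusters = [[(0.0, 0.0), (0.0, 2.0)], [(10.0, 0.0), (10.0, 2.0)]]
print(dunn_index(toy_clusters))  # 10 / 2 = 5.0
```

Dunn and Davies–Bouldin pull in opposite directions by construction (one is maximized, the other minimized), which is why checking both gives a more robust validity verdict.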


Wastewater Quality Screening Using Affinity Propagation Clustering and Entropic Methods for Small Saturated Nonlinear Orthogonal Datasets

April 2022 · 29 Reads · 4 Citations

Wastewater recycling efficiency improvement is vital to arid regions, where crop irrigation is imperative. Analyzing small, unreplicated–saturated, multiresponse, multifactorial datasets from novel wastewater electrodialysis (ED) applications requires specialized screening/optimization techniques. A new approach is proposed to glean information from structured Taguchi-type sampling schemes (nonlinear fractional factorial designs) in the case that direct uncertainty quantification is not computable. It uses a double information analysis–affinity propagation clustering and entropy to simultaneously discern strong effects and curvature type while profiling multiple water-quality characteristics. Three water quality indices, which are calculated from real ED process experiments, are analyzed by examining the hierarchical behavior of four controlling factors: (1) the dilute flow, (2) the cathode flow, (3) the anode flow, and (4) the voltage rate. The three water quality indices are: the removed sodium content, the sodium adsorption ratio, and the soluble sodium percentage. The factor that influences the overall wastewater separation ED performance is the dilute flow, according to both analyses’ versions. It caused the maximum contrast difference in the heatmap visualization, and it minimized the relative information entropy at the two operating end points. The results are confirmed with a second published independent dataset. Furthermore, the final outcome is scrutinized and found to agree with other published classification and nonparametric screening solutions. A combination of modern classification and simple entropic methods which are offered through freeware R-packages might be effective for testing high-complexity ‘small-and-dense’ nonlinear OA datasets, highlighting an obfuscated experimental uncertainty.
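The relative-entropy reading above (the dilute flow minimizing entropy at its two operating end points) can be made concrete by normalizing the Shannon entropy of an observed level distribution by its maximum, log2(k). A sketch with invented response counts:

```python
from math import log2

def relative_entropy(counts):
    """Shannon entropy of an observed level distribution, normalized by
    log2(k): 0 means the responses collapse onto a single level (a
    decisive, influential setting), 1 means maximal indecision.
    """
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    h = -sum(p * log2(p) for p in probs)
    k = len(counts)
    return h / log2(k) if k > 1 else 0.0

print(relative_entropy([4, 0, 0]))  # 0.0: fully decided outcome
print(relative_entropy([2, 2, 2]))  # ~1.0: maximal uncertainty
```

A factor whose end settings drive this quantity toward zero is pinning the response to one behavior, which is exactly the signature of a strong effect in the entropic analysis.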


Citations (6)


... Moreover, it has been suggested that there is value in aligning DOE techniques-in the Lean Six Sigma methodologies-to artificial intelligence technology in industrial applications [54]. This work expands on several recent attempts to adopt and apply the broad know-how from modern algorithmic engines in order to profile state-of-the-art wastewater processes, counting on small-structured datasets to describe the influence of multiple inputs on multiple outputs [55][56][57]. ...

Reference:

Non-Linear Saturated Multi-Objective Pseudo-Screening Using Support Vector Machine Learning, Pareto Front, and Belief Functions: Improving Wastewater Recycling Quality
Datacentric Similarity Matching of Emergent Stigmergic Clustering to Fractional Factorial Vectoring: A Case for Leaner-and-Greener Wastewater Recycling

... Third, our work provides a significant, broad-ranging application for SSDs, which has been lacking in the literature. Despite their considerable promise (Weese et al., 2021), we only know of a handful of scattered articles that report on their use (Carpinteiro et al., 2004;Dejaegher and Vander Heyden, 2008;Jridi et al., 2015;Zarkadas and Besseris, 2023). Our work changes that, by applying these designs to a large class of biological/chemical screens and demonstrating their usefulness in real experiments. ...

Using Lean-and-Green Supersaturated Poly-Factorial Mini Datasets to Profile Energy Consumption Performance for an Apartment Unit

... Moreover, it has been suggested that there is value in aligning DOE techniques-in the Lean Six Sigma methodologies-to artificial intelligence technology in industrial applications [54]. This work expands on several recent attempts to adopt and apply the broad know-how from modern algorithmic engines in order to profile state-of-the-art wastewater processes, counting on small-structured datasets to describe the influence of multiple inputs on multiple outputs [55][56][57]. ...

Wastewater Quality Screening Using Affinity Propagation Clustering and Entropic Methods for Small Saturated Nonlinear Orthogonal Datasets

... Interestingly, it was suggested that it may be practical to pace the retrofitting process of older residential buildings by examining them on an apartment unit basis [91]. By taking advantage of the construction-design modularity and the comprehensive information on the electromechanical systems configuration, which are stored in the national building certification register platforms, EPC-generating software packages may be utilized to conveniently screen and optimize the energy-consumption performance of any residential apartment unit. ...

Lean Screening for Greener Energy Consumption in Retrofitting a Residential Apartment Unit

... Moreover, it has been suggested that there is value in aligning DOE techniques-in the Lean Six Sigma methodologies-to artificial intelligence technology in industrial applications [54]. This work expands on several recent attempts to adopt and apply the broad know-how from modern algorithmic engines in order to profile state-of-the-art wastewater processes, counting on small-structured datasets to describe the influence of multiple inputs on multiple outputs [55][56][57]. ...

Micro-Clustering and Rank-Learning Profiling of a Small Water-Quality Multi-Index Dataset to Improve a Recycling Process

... Tsarouhas and Arvanitoyannis (2010), applied the normal distribution to determine the failure times and the logistic distribution for the repair times, with the aim of identifying the best fit of failure data in a reliability study for the packaging of beer production. Tsarouhas and Besseris (2017), tested and selected an optimal statistical model after considering several distributions, allowing them to make estimates of availability for different periods of time. Later, Tsarouhas (2018) conducted an investigation to determine which probability distribution provides the best fit to characterize the failure pattern at the levels of machinery and production line. ...

Maintainability analysis in shaving blades industry: a case study
  • Citing Article
  • April 2017

International Journal of Quality & Reliability Management