University of Luxembourg
  • Esch-sur-Alzette, Luxembourg
Recent publications
Additive Manufacturing (AM) of metallic objects is the process of depositing material layer upon layer to produce a dense, solid physical object. During deposition, many process parameters, such as thermal history, power, material feed rate, and microstructure, are involved and need to be regulated. The recent surge of 3D printing across various sectors has added a set of challenges, creating a massive demand for automated process monitoring to ensure good deposition quality. In this paper, a literature review of vision-based techniques for the additive manufacturing process is presented, and a molten pool profile extraction methodology is performed. The image processing results can be used to create a visual monitoring technique and inspire future research on the use of vision sensing in the 3D printing process.
The number line estimation task is an often-used measure of numerical magnitude understanding. The task also correlates substantially with broader measures of mathematical achievement. This raises the question of whether the task would be a useful component of mathematical achievement tests and instruments to diagnose dyscalculia or mathematical giftedness and whether a stand-alone version of the task can serve as a short screener for mathematical achievement. Previous studies on the relation between number line estimation accuracy and broader mathematical achievement were limited in that they used relatively small nonrepresentative samples and usually did not account for potentially confounding variables. To close this research gap, we report findings from a population-level study with nearly all Luxembourgish ninth-graders (N = 6484). We used multilevel regressions to test how a standardized mathematical achievement test relates to the accuracy in number line estimation on bounded number lines with whole numbers and fractions. We also investigated how these relations were moderated by classroom characteristics, person characteristics, and trial characteristics. Mathematical achievement and number line estimation accuracy were associated even after controlling for potentially confounding variables. Subpopulations of students showed meaningful differences in estimation accuracy, which can serve as benchmarks in future studies. Compared with the number line estimation task with whole numbers, the number line estimation task with fractions was more strongly related to mathematical achievement in students across the entire mathematical achievement spectrum. These results show that the number line estimation task is a valid and useful tool for diagnosing and monitoring mathematical achievement.
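Estimation accuracy in this task is conventionally quantified as percentage absolute error (PAE); the formula below is the standard one in this literature, while the 0-1000 line and the example values are illustrative assumptions, not data from the study:

```python
# Percentage absolute error for a bounded number line estimate:
# PAE = |estimate - target| / (length of the line) * 100.

def percent_absolute_error(estimate, target, line_start, line_end):
    """Lower PAE means a more accurate number line estimate."""
    return abs(estimate - target) / (line_end - line_start) * 100

# A student asked to place 350 on a 0-1000 line marks position 425.
print(percent_absolute_error(425, 350, 0, 1000))  # 7.5
```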
The chemical pollution crisis severely threatens human and environmental health globally. To tackle this challenge, the establishment of an overarching international science–policy body has recently been suggested. We strongly support this initiative, based on the awareness that humanity has likely already left the safe operating space within planetary boundaries for novel entities, including chemical pollution. Immediate action is essential and needs to be informed by sound scientific knowledge and data compiled and critically evaluated by an overarching science–policy interface body. Major challenges for such a body are (i) to foster global knowledge production on exposure, impacts, and governance, going beyond data-rich regions (e.g., Europe and North America), (ii) to cover the entirety of hazardous chemicals, mixtures, and wastes, (iii) to follow a one-health perspective considering the risks posed by chemicals and waste to ecosystem and human health, and (iv) to strive for solution-oriented assessments based on systems thinking. Given the multiple lines of evidence calling for urgent action on a global scale, we call on scientists and practitioners to mobilize their scientific networks and to intensify science–policy interaction with national governments to support the negotiations on the establishment of an intergovernmental body, explaining the anticipated benefits for human and environmental health.
This study investigates the impact of flows between bond and equity funds on investment factors over the period 1984–2015. It determines contemporaneous mispricing effects and a statistical reversal relation between these flows and both legs of the investment factor. The statistical reversal relationship between previous flows and the investment factor is economically significant. A one-standard-deviation shock to flows causes a 0.29% decrease in investment factor returns, which are reversed within 5 months. A trading strategy based on signals from past flows and the investment factor outperforms the market by 0.68% in the months following positive flows and produces significant alphas after accounting for well-known equity risk factors. The findings are interpreted as evidence in favor of a behavioral explanation, in which sentiment influences actual managerial decisions. When retail investors and managers are swept up in market euphoria, retail investors shift their holdings from bond to equity mutual funds, and high-investment firms invest more aggressively. Market-level euphoria has a different impact on high- and low-investment firms, and thus the investment factor can be influenced. Hence, the mispricing occurs during these periods, and the reversal relationship is especially pronounced for a high-investment portfolio versus a low-investment portfolio. As a result, during the months following periods of positive flows, the investment factor outperforms the market factor. Interestingly, this study’s measure of flows, which serves as a proxy for market-level euphoria, outperforms other measures of investor sentiment.
In the context of Industry 4.0, companies understand the advantages of performing Predictive Maintenance (PdM). However, when moving towards PdM, several considerations must be carefully examined. First, they need a sufficient number of production machines and associated fault data to generate maintenance predictions. Second, they need to adopt the right maintenance approach, which, ideally, should self-adapt to the machinery, the organization's priorities, and technician skills, while also being able to deal with uncertainty. Reinforcement learning (RL) is envisioned as a key technique in this regard due to its inherent ability to learn by interacting through trial and error, yet very few RL-based maintenance frameworks have been proposed in the literature so far, and those that exist are limited in several respects. This paper proposes a new multi-agent approach that learns a maintenance policy performed by technicians, under the uncertainty of multiple machine failures. This approach comprises RL agents that partially observe the state of each machine to coordinate decision-making in maintenance scheduling, resulting in the dynamic assignment of maintenance tasks to technicians (with different skills) over a set of machines. Experimental evaluation shows that our RL-based maintenance policy outperforms traditional maintenance policies (including corrective and preventive ones) in terms of failure prevention and downtime, improving overall performance by ≈75%.
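To make the learning mechanism concrete, here is a deliberately simplified single-agent sketch (the paper's approach is multi-agent with partial observability): tabular Q-learning chooses between waiting and maintaining one machine. The degradation model, rewards, and hyperparameters are all illustrative assumptions:

```python
import random

# Toy setting: machine state degrades 0 -> 1 -> 2 (failed).
# Actions: 0 = wait, 1 = maintain. Rewards are assumptions:
# production pays +1, maintenance costs -1, downtime costs -10.

def step(state, action, rng):
    if action == 1:                      # maintenance resets the machine
        return 0, -1.0
    if state == 2:                       # failed: heavy downtime penalty
        return 2, -10.0
    nxt = state + 1 if rng.random() < 0.3 else state
    return nxt, 1.0                      # production reward while running

def train(episodes=500, alpha=0.1, gamma=0.9, eps=0.1, seed=1):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(3)]   # q[state][action]
    for _ in range(episodes):
        s = 0
        for _ in range(20):
            if rng.random() < eps:       # epsilon-greedy exploration
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: q[s][x])
            s2, r = step(s, a, rng)
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = train()
# The learned policy should repair a failed machine rather than wait.
print(q[2][1] > q[2][0])  # True
```

The multi-agent version in the paper additionally coordinates several such learners and assigns tasks to technicians with different skills, which this sketch does not attempt.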
Organic Rankine cycles employing carbon dioxide (CO2) for waste heat recovery have become popular in recent years thanks to the fluid's excellent heat transfer characteristics and small environmental footprint. Low-grade waste heat (<240 °C) represents the major portion of excess heat globally, but it is hard to recover due to the small temperature gap between heat source and heat sink, which leads to poor efficiency of the Rankine cycle. Therefore, numerous modifications of the power cycle layout have been proposed by academia and industry, among them reheated expansion, recuperation, and intercooled compression. This work compares ten cycle architectures for a defined waste heat source (60–100 °C) and heat sink (20 °C). Firstly, CO2 cycle architectures from the literature are examined with their original operational parameters. Secondly, the predefined low-grade heat source is implemented into the cycle. The cycles are assessed regarding efficiency, mass flow, and pressure. Results show that for source temperatures higher than 80 °C, recuperation and reheated expansion enhance the cycle performance, whereas intercooled compression negatively affects the efficiency. The conventional configuration operated most efficiently for temperatures up to 80 °C. A road map of thermodynamic efficiencies of CO2 cycle architectures for low-grade waste heat recovery up to 100 °C is delivered.
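The efficiency ceiling imposed by a small source-sink temperature gap can be illustrated with the Carnot limit; the sketch below is a generic thermodynamic bound, not a result from the paper, evaluated at the study's stated source (60–100 °C) and sink (20 °C) temperatures:

```python
# Carnot limit: no heat engine between a source at T_hot and a sink at
# T_cold (absolute temperatures) can exceed eta = 1 - T_cold / T_hot.

def carnot_efficiency(t_hot_c, t_cold_c):
    """Maximum thermal efficiency for source/sink given in Celsius."""
    t_hot = t_hot_c + 273.15
    t_cold = t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

for t_source in (60, 80, 100):
    print(f"{t_source} C source, 20 C sink: eta_max = "
          f"{carnot_efficiency(t_source, 20):.3f}")
```

Even the ideal limit stays around 12-21% for this range, which is why the paper's real cycles, operating well below the Carnot bound, are so sensitive to architectural choices such as recuperation and reheated expansion.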
There has been growing interest in controlled heat flux manipulation to increase the efficiency of thermal apparatus. Heat manipulators control and manipulate heat flow, and a key to their effective performance is their thermal design. Such designs can be achieved with materials specially engineered to have outstanding properties that cannot be achieved with natural materials (known as metamaterials or meta-structures), whose geometry and material properties can be optimized for a specific objective. In this work, we focus on thermal metamaterial-based heat manipulators such as the thermal concentrator, which concentrates the heat flux in a specified region of the domain. The main scope of the current work is to optimize the shape of the heat manipulators using the Particle Swarm Optimization (PSO) method. The geometry is defined using NURBS basis functions due to their higher smoothness and continuity, and the thermal boundary value problem is solved using Isogeometric Analysis (IGA). Often, using nodes as design variables (as in the Lagrange finite element method) generates serrated boundary shapes that need to be smoothed afterwards. For a NURBS-based boundary with control points as design variables, the required smoothness can be predefined through the knot vectors, and smoothing in post-processing can be avoided. The optimized shape generated by PSO is compared with the other shape exploited in the literature. The effects of the number of design variables, the thermal conductivity of the materials used, and some of the geometry parameters on the optimum shapes are also demonstrated.
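As a minimal illustration of the optimizer involved, the sketch below runs a plain particle swarm on a toy quadratic objective; it omits the NURBS parameterization and the IGA solver entirely, and all hyperparameters are illustrative assumptions:

```python
import random

# Minimal PSO: each particle keeps a position, a velocity, and its
# personal best; velocities are pulled toward the personal best and
# the swarm's global best. Here the "design variables" are just the
# coordinates of a point minimizing f(x) = sum(x_i^2).

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(lambda p: sum(x * x for x in p))
print(best_val)  # close to 0
```

In the paper's setting, each objective evaluation would instead require an IGA solve of the thermal boundary value problem with the particle's control-point coordinates defining the NURBS boundary.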
In real-time systems, priorities assigned to real-time tasks determine the order of task executions, by relying on an underlying task scheduling policy. Assigning optimal priority values to tasks is critical to allow the tasks to complete their executions while maximizing safety margins from their specified deadlines. This enables real-time systems to tolerate unexpected overheads in task executions and still meet their deadlines. In practice, priority assignments result from an interactive process between the development and testing teams. In this article, we propose an automated method that aims to identify the best possible priority assignments in real-time systems, accounting for multiple objectives regarding safety margins and engineering constraints. Our approach is based on a multi-objective, competitive coevolutionary algorithm mimicking the interactive priority assignment process between the development and testing teams. We evaluate our approach by applying it to six industrial systems from different domains and several synthetic systems. The results indicate that our approach significantly outperforms both our baselines, i.e., random search and sequential search, and solutions defined by practitioners. Our approach scales to complex industrial systems as an offline analysis method that attempts to find near-optimal solutions within acceptable time, i.e., less than 16 hours.
Timely patching (i.e., the act of applying code changes to a program's source code) is paramount to safeguarding users and maintainers against the dire consequences of malicious attacks. In practice, patching is prioritized according to the nature of the code change that is committed in the code repository. When such a change is labeled as being security-relevant, i.e., as fixing a vulnerability, maintainers rapidly spread the change, and users are notified about the need to update to a new version of the library or of the application. Unfortunately, some security-relevant changes often go unnoticed because they represent silent fixes of vulnerabilities. In this paper, we propose SSPCatcher, a Co-Training-based approach to catch security patches (i.e., patches that address vulnerable code) as part of an automatic monitoring service of code repositories. Leveraging different classes of features, we empirically show that such automation is feasible and can yield a precision of over 80% in identifying security patches, with an unprecedented recall of over 80%. Beyond such benchmarking with ground truth data, which demonstrates an improvement over the state of the art, we confirmed that SSPCatcher can help catch security patches that were not reported as such.
One out of every three children under age 5 in developing countries lives in conditions that impede human capital development. In this study, we survey the literature on parenting training programs implemented before age 5 with the aim of increasing parental investment in human capital accumulation in developing countries. Our review focuses on the implementation and effectiveness of parenting training programs (i.e., training in child psychosocial stimulation and/or training about nutrition). We emphasize the mechanisms that drive treatment-induced change in human capital outcomes and identify the demand- and supply-side behaviors that affect efficacy and effectiveness. Although the literature includes evidence on program features that are associated with successful interventions, further evidence on the dynamics of human capital formation, documentation of medium- to long-term persistence of treatment impacts, and research on the implementation and evaluation of programs at scale are needed to delineate a scalable and inclusive program that provides long-term treatment impacts. The expected final online publication date for the Annual Review of Resource Economics, Volume 14, is October 2022. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
We study the total α-powered length of the rooted edges in a random minimal directed spanning tree, first introduced in Bhatt and Roy (2004), on a Poisson process on the unit cube. While a Dickman limit was proved in Penrose and Wade (2004) in the two-dimensional case, in dimensions three and higher, Bai, Lee and Penrose (2006) showed a Gaussian central limit theorem for a restricted range of the power α, with an explicit rate of convergence. In this article, we extend these results and prove a central limit theorem in any dimension for any choice of the power α. Moreover, making use of recent results in Stein's method for region-stabilizing functionals, we provide presumably optimal non-asymptotic bounds on the Wasserstein and the Kolmogorov distances between the distribution of the total α-powered length of rooted edges, suitably normalized, and that of a standard Gaussian random variable.
Electrocaloric materials in the shape of multilayer capacitors are excellent working bodies for building cooling devices, as they can exhibit large changes in temperature triggered by changes in voltage over a large range of temperatures. However, high electric fields are required to drive such a large effect, and yet their efficiency has been overlooked. Here we report on the materials efficiency of lead scandium tantalate multilayer capacitors. We show that they can exchange 25 times more heat than the electrical work required to generate it. We also observe that these multilayer capacitors are four times less efficient than lead scandium tantalate bulk ceramics for the same heat exchanged, though they exhibit a large electrocaloric effect over a much larger temperature range. Moreover, their efficiency is comparable to that of Gd, the prototypical magnetocaloric material. This work ultimately provides an extensive picture of the pros and cons of lead scandium tantalate.
This article develops a theoretical model of the trajectories and transitions of thought from the perspective of semiotic cultural psychology. Inner speech theory, concept formation, and dialogical self theory are integrated to explore the particularity of thought transitions. It is concluded that thought transits along a vertical and a horizontal axis, from an irrevocable past to an uncertain future and from the lower levels of consciousness to the higher levels of thought, determined by the structural and semantic nature of inner speech, the quality of the concept formation process, and the different dialogical relationships that occur between the I-positions of the self. It is proposed that these dynamics of thought make it an idiosyncratic, historical, and genetic phenomenon, which makes empirical approaches difficult and influences theorizing about the thinking process.
Let K be a number field, and let G be a finitely generated subgroup of K^×. We are interested in computing the degree of the cyclotomic-Kummer extension K(ⁿ√G) over K, where ⁿ√G consists of all n-th roots of the elements of G. We develop the theory of entanglements introduced by Lenstra, and we apply it to compute the above degrees.
Awareness of the adverse effects of exposure to pollutant mixtures, which may be much more severe than those of individual chemicals, has drawn attention to the necessity of using multi-residue methods to obtain the most comprehensive information possible on the exposome. Among the different biological matrices used for exposure assessment, hair enables the detection of the largest number of chemicals, covering many classes such as persistent pollutants, hydrophilic metabolites, and metals. Most biomonitoring studies, however, focus on a limited number of pollutants and give only partial information on exposure. Combining several multi-residue methods, the present study aimed to assess the exposure of a population to an extensive variety of chemicals through hair analysis. One hair sample was collected from each participant (55 children and 134 adults). Samples were analysed with three different multi-residue methods, targeting, respectively, 152 organic pollutants (pesticides, PCBs, bisphenols, PBDEs); 62 polycyclic aromatic hydrocarbons (PAHs) and metabolites, plus nicotine and cotinine; and 36 metals. From 33 to 70 organic chemicals were detected in each child's hair sample, and from 34 up to 74 in adults'. From 7 to 26 PAHs were detected per child, and 7 to 21 per adult. Twenty-three to 27 metals were detected per child and 21 to 28 per adult. The highest median concentrations were observed for zinc (143 μg/mg in children; 164 μg/mg in adults), bisphenol A (95.9 pg/mg in children; 64.7 pg/mg in adults) and nicotine (66.4 pg/mg in children; 51.9 pg/mg in adults). The present study provides an exceptionally comprehensive exposure assessment and highlights the simultaneous exposure of the general population to multiple classes of pollutants. The results support the use of multi-residue methods in future studies on exposure-associated effects, to document the exposome and better consider the effects of chemical mixtures.
A monolithic numerical scheme for fluid–structure interaction (FSI), with special interest in thin-walled piezoelectric energy harvesters driven by fluid, is proposed. Employing a beam/shell model for the thin-walled structure in this particular application creates an FSI problem in which an (n−1)-dimensional structure is embedded in an n-dimensional fluid flow. This choice induces a strongly discontinuous pressure field along the moving fluid–solid interface. We overcome this challenge within a continuous finite element framework by a splitting-fluid-domain approach. The governing equations of the multiphysics problem are solved in a simultaneous fashion in order to reliably capture the main dynamic characteristics of the strongly coupled system, which involves a large-deformation piezoelectric composite structure, an integrated electric circuit, and an incompressible viscous fluid. The monolithic solution scheme is based on the weighted residuals method, with the boundary-fitted finite element method used for the discretization in space and the generalized-α method for the discretization in time. The proposed framework is evaluated against reference data of two popular FSI benchmark problems. Two additional numerical examples of flow-driven thin-walled piezoelectric energy harvesters demonstrate the ability of the framework to predict controlled cyclic response and limit-cycle oscillations, and thus the power output in typical operational states of this class of energy harvesting devices.
Firms use a variety of practices to disclose the knowledge generated by their R&D activities, including, but not limited to, publishing findings in scientific journals, patenting new technologies, and contributing to developing standards. While the individual effects of engaging in these practices on firm innovation are well understood, the existing literature has not considered their interrelation. Therefore, our study examines whether the three practices are complements, substitutes, or unrelated in terms of firms' performance with product innovations new to the market. Our analysis builds on a sample of innovation-active firms from the German Community Innovation Survey, which includes information on the development of standards, enhanced with information on firms' engagement in patenting and publishing. We find that 26% of innovation-active firms engage in at least one of the three practices, and 22% of engaging firms combine them. Using supermodularity tests, we show that publishing and patenting, as well as patenting and developing standards, are substitutes. Publishing and developing standards are not significantly linked. Based on our findings, we derive implications for innovation management and policy.
3,217 members
Symeon Chatzinotas
  • Interdisciplinary Centre for Security, Reliability and Trust
Hugues Nicolay
  • Faculty of Law, Economics and Finance
Anne Grünewald
  • Luxembourg Centre for Systems Biomedicine (LCSB)
Anja K Leist
  • Department of Social Sciences
Information
Address
2 Avenue de l'Universite, L-4365, Esch-sur-Alzette, Luxembourg
Website
http://wwwen.uni.lu