Università degli Studi di Trento

• Trento, Italy
Recent publications
The accurate simulation of additional interactions at the ATLAS experiment for the analysis of proton–proton collisions delivered by the Large Hadron Collider presents a significant challenge to the computing resources. During the LHC Run 2 (2015–2018), there were up to 70 inelastic interactions per bunch crossing, which need to be accounted for in Monte Carlo (MC) production. In this document, a new method to account for these additional interactions in the simulation chain is described. Instead of sampling the inelastic interactions and adding their energy deposits to a hard-scatter interaction one-by-one, the inelastic interactions are presampled, independent of the hard scatter, and stored as combined events. Consequently, for each hard-scatter interaction, only one such presampled event needs to be added as part of the simulation chain. For the Run 2 simulation chain, with an average of 35 interactions per bunch crossing, this new method provides a substantial reduction in MC production CPU needs of around 20%, while reproducing the properties of the reconstructed quantities relevant for physics analyses with good accuracy.
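The presampling idea described above can be caricatured in a few lines. The sketch below is purely illustrative and not ATLAS software: events are represented as hypothetical {cell_id: energy} dicts, pile-up events are combined once into a reusable library, and each hard-scatter event then pays for only a single overlay instead of sampling dozens of inelastic interactions.

```python
import math
import random
from collections import defaultdict

def poisson(mu, rng):
    """Knuth's Poisson sampler (adequate for mu up to a few tens)."""
    limit, k, p = math.exp(-mu), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def presample_pileup(minbias_pool, mu, n_events, rng):
    """Build a library of combined pile-up events ONCE, up front.
    Each minimum-bias event is a {cell_id: energy} dict; mu is the
    average number of inelastic interactions per bunch crossing."""
    library = []
    for _ in range(n_events):
        combined = defaultdict(float)
        for _ in range(poisson(mu, rng)):
            for cell, e in rng.choice(minbias_pool).items():
                combined[cell] += e
        library.append(dict(combined))
    return library

def overlay(hard_scatter, library, rng):
    """Per hard-scatter event, add just ONE presampled pile-up event."""
    event = dict(hard_scatter)
    for cell, e in rng.choice(library).items():
        event[cell] = event.get(cell, 0.0) + e
    return event
```

The CPU saving comes from amortization: the inner double loop runs once per library entry rather than once per hard-scatter event.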
Policy makers have implemented multiple non-pharmaceutical strategies to mitigate the COVID-19 worldwide crisis. Interventions had the aim of reducing close proximity interactions, which drive the spread of the disease. A deeper knowledge of human physical interactions has proven necessary, especially in all settings involving children, whose education and gathering activities should be preserved. Despite their relevance, almost no data are available on close proximity contacts among children in schools or other educational settings during the pandemic. Contact data are usually gathered via Bluetooth, which nonetheless offers low temporal and spatial resolution. Recently, ultra-wideband (UWB) radios emerged as a more accurate alternative that nonetheless exhibits a significantly higher energy consumption, limiting in-field studies. In this paper, we leverage a novel approach, embodied by the Janus system, which combines these radios by exploiting their complementary benefits. The very accurate proximity data gathered in-field by Janus, once augmented with additional metadata, unlocks unprecedented levels of information, enabling the development of novel multi-level risk analyses. By means of this technology, we collected real contact data of children and educators in three summer camps during summer 2020 in the province of Trento, Italy. The wide variety of daily activities performed induced multiple individual behaviors, allowing a rich investigation of social environments from the contagion-risk perspective. We consider risk based on duration and proximity of contacts and classify interactions according to different risk levels. We can then evaluate the summer camps’ organization, observe the effect of partitioning into small groups, or social bubbles, and identify the organized activities that mitigate the riskier behaviors.
Overall, we offer an insight into the educator-child and child-child social interactions during the pandemic, thus providing a valuable tool for schools, summer camps, and policy makers to (re)structure educational activities safely.
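The duration-and-proximity risk classification described above can be illustrated with a toy two-axis rule. The thresholds below (1.5 m, 15 min) are hypothetical placeholders, not the levels used in the Janus study:

```python
def classify_contact(distance_m, duration_min,
                     close_m=1.5, long_min=15.0):
    """Toy risk level for one contact: riskier when closer AND longer.
    Thresholds are illustrative only."""
    close = distance_m <= close_m
    sustained = duration_min >= long_min
    if close and sustained:
        return "high"
    if close or sustained:
        return "medium"
    return "low"
```

Aggregating such per-contact labels over a day then lets one compare activities or social bubbles by the share of high-risk interactions they induce.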
The ATLAS experiment at the Large Hadron Collider has a broad physics programme ranging from precision measurements to direct searches for new particles and new interactions, requiring ever larger and ever more accurate datasets of simulated Monte Carlo events. Detector simulation with Geant4 is accurate but requires significant CPU resources. Over the past decade, ATLAS has developed and utilized tools that replace the most CPU-intensive component of the simulation—the calorimeter shower simulation—with faster simulation methods. Here, AtlFast3, the next generation of high-accuracy fast simulation in ATLAS, is introduced. AtlFast3 combines parameterized approaches with machine-learning techniques and is deployed to meet current and future computing challenges, and simulation needs of the ATLAS experiment. With highly accurate performance and significantly improved modelling of substructure within jets, AtlFast3 can simulate large numbers of events for a wide range of physics processes.
Graphite is a fascinating material with unique properties, making it irreplaceable for a wide range of applications. However, its current processing route is highly energy demanding, as it requires dwelling for several hours at high temperatures (2500-3000°C). We report on the near-full consolidation (relative density greater than 95%) at room temperature of graphite flakes under a mild uniaxial or isostatic pressure (100-500 MPa). The application of an external pressure promoted the formation of van der Waals bonds between the flakes, and the consolidation (pore removal) was mostly achieved by interplanar slipping. Despite the room-temperature processing, with embodied energy below 1 MJ/kg, the resulting compact had in-plane electrical and thermal conductivities as high as 0.77×10⁶ S/m and 620 W/m·K (exceeding commercial isotropic graphite, ≈0.09×10⁶ S/m and 120 W/m·K). The bulks were thermally stable up to 1800°C. Because of the reversible nature of the van der Waals bonding, the cold-pressed pellets were fully recyclable (i.e., easily milled and re-shaped) with a mild degradation of the electrical conductivity from 0.77 to 0.19×10⁶ S/m after ten cycles.
We present a family of relaxation models for thin-film flows where both viscosity and surface tension effects are inherent. First, a first-order hyperbolic approximation to the dissipationless part of the system is presented. The method is based on an augmented Lagrangian approach, where a classical penalty method is used and high-order derivatives in the Lagrangian are promoted to new independent variables, for which hyperbolic closure equations are sought. Then, we show that the viscous terms can be treated either by plugging them directly into the obtained system, making it of hyperbolic-parabolic type, or by casting them into an approximate algebraic source term that is asymptotically equivalent to the former formulation. Finally, the extension of the method to a classical nonlinear surface tension model is also presented. Numerical results for all the proposed models are shown and compared with experimental results and reference solutions.
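The variable-promotion step can be written schematically. In the sketch below the symbols are generic (h film thickness, u depth-averaged velocity, σ surface tension coefficient, λ a large penalty parameter), not the paper's exact functional: the slope ∂ₓh is promoted to a new independent variable p, and a quadratic penalty enforces the constraint approximately, so no derivative of order higher than one remains in the Lagrangian.

```latex
% Augmented-Lagrangian relaxation (schematic): replace the capillary
% term (\sigma/2)(\partial_x h)^2 by a penalized surrogate in p
\mathcal{L}_\lambda \;=\; \int \left[ \frac{h\,u^2}{2}
  \;-\; \frac{\sigma}{2}\,p^2
  \;-\; \frac{\lambda}{2}\left(p - \partial_x h\right)^2 \right] dx ,
\qquad p \;\xrightarrow{\;\lambda \to \infty\;}\; \partial_x h .
```

The Euler-Lagrange equations of such a penalized Lagrangian contain only first-order derivatives, which is what makes a first-order hyperbolic formulation with closure equations for p possible.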
Italy was among the first countries to introduce drastic measures to reduce individual mobility in order to slow the diffusion of COVID-19. The first measures imposed by the central authorities on March 8, 2020, were unanticipated and highly localized, focusing on 26 provinces. Additional nationwide measures were imposed after one day, and were removed only after June 3. Looking at these watershed moments of the pandemic, this paper explores the impact of the adoption of localized restrictions on changes in individual mobility in Italy using a spatial discontinuity approach. Results show that these measures lowered individual mobility by 7 percentage points on top of the reduction in mobility recorded in the adjacent untreated areas. The study also fills a gap in the literature in that it looks at the changes in mobility after the nationwide restrictions were lifted and shows how the recovery in mobility patterns is related to various characteristics of local labour markets. Areas with a higher proportion of professions exposed to diseases, more suitable for flexible work arrangements, and with a higher share of fixed-term contracts before the pandemic are characterised by a smaller increase in mobility after re-opening.
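A spatial discontinuity comparison of the kind described above can be caricatured as a difference in means across the provincial border. The sketch below is a deliberately naive stand-in for the paper's design (no covariates, no kernel weighting, no inference); `dist_km` is a hypothetical signed distance to the border, negative inside the restricted provinces:

```python
from statistics import mean

def border_discontinuity(dist_km, mobility_change, bw_km=20.0):
    """Naive sharp spatial-RD estimate: mean mobility change just
    inside minus just outside the border, within a bandwidth."""
    inside = [m for d, m in zip(dist_km, mobility_change)
              if -bw_km <= d < 0.0]
    outside = [m for d, m in zip(dist_km, mobility_change)
               if 0.0 <= d <= bw_km]
    return mean(inside) - mean(outside)
```

A negative estimate indicates mobility fell by more on the treated side than in the adjacent untreated areas.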
Dealing with the detection of the faulty elements of planar antenna arrays with a probabilistic Bayesian compressive sensing (BCS) approach, a key asset for the reliable prediction of the actual status of the antenna under test is the sampling strategy used to remotely collect the far-field (FF) data. The aim of this letter is to provide insights into the effectiveness and the reliability of different FF sampling strategies to collect the input data for a state-of-the-art array diagnosis method based on a multitask BCS technique. Representative results are shown to verify the impact of each sampling strategy on the achievable reconstructions.
In this paper an approach to the design of robust global attitude tracking controllers for fully actuated rigid bodies is proposed. The challenge of simultaneously dealing with topological obstructions to global attitude tracking and with disturbances affecting the attitude dynamics is tackled by means of a hybrid hierarchical design that exploits the cascade structure of the underlying mathematical model. The proposed hierarchical strategy is based on an inner–outer loop paradigm comprising a dynamic control law for angular velocity tracking (inner loop) and a hybrid control law for attitude tracking (outer loop). By leveraging recent tools for the stability analysis of hybrid systems, we prove a robust global tracking property by assuming mild properties on the dynamics of the velocity feedback. We also discuss a few relevant examples satisfying these properties, encompassing harmonic disturbance compensators and conditional integrators, capable of rejecting unknown constant disturbances with an intrinsic anti-windup action.
In this article, we present a flow-based framework for multi-modal trajectory prediction, which is able to provide an accurate and explicit inference of the latent representations of trajectory data. Unlike other typical generative models (such as GANs and VAEs), flow-based models aim at learning the data distribution explicitly through an invertible network, which can convert a complicated distribution into a tractable form via invertible transformations. The whole framework is built upon the standard encoder-decoder architecture, where an LSTM is exploited as the fundamental block to capture the temporal structure of a trajectory. As a core module, we incorporate an invertible network that can learn the multi-modal distributions of trajectory data and further generate plausible future paths by sampling from the standard Gaussian distribution. Extensive experiments carried out on synthetic and realistic datasets demonstrate the effectiveness of the proposed approach, and show its advantages compared to GAN-based and VAE-based prediction frameworks.
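The invertibility at the heart of a flow-based model can be shown with a single RealNVP-style affine coupling layer. The sketch below (NumPy, randomly initialized linear conditioners) is a generic illustration of the mechanism, not the architecture used in the paper:

```python
import numpy as np

class AffineCoupling:
    """One invertible coupling layer: the first half of x passes
    through unchanged and parameterizes a scale/shift applied to the
    second half, so forward and inverse are both in closed form."""
    def __init__(self, dim, rng=None):
        rng = rng or np.random.default_rng(0)
        self.d1 = dim // 2
        self.Ws = rng.normal(0.0, 0.1, (self.d1, dim - self.d1))  # log-scale
        self.Wt = rng.normal(0.0, 0.1, (self.d1, dim - self.d1))  # shift

    def forward(self, x):
        x1, x2 = x[:self.d1], x[self.d1:]
        s = np.tanh(x1 @ self.Ws)          # bounded log-scales
        t = x1 @ self.Wt
        y = np.concatenate([x1, x2 * np.exp(s) + t])
        return y, s.sum()                  # exact log|det Jacobian|

    def inverse(self, y):
        y1, y2 = y[:self.d1], y[self.d1:]
        s = np.tanh(y1 @ self.Ws)
        t = y1 @ self.Wt
        return np.concatenate([y1, (y2 - t) * np.exp(-s)])
```

Generating a plausible future path then amounts to drawing z from a standard Gaussian and pushing it through the inverses of a stack of such layers; the exact log-determinant is what makes explicit likelihood training possible.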
Modern societies produce ever-increasing amounts of waste, e.g. the organic fraction of municipal solid waste (OFMSW). According to the best available techniques, OFMSW should be treated through anaerobic digestion to recover biogas and subsequently composted. An innovative scheme is under investigation, where anaerobic digestion is combined with hydrothermal carbonization (HTC) and composting. The final product, referred to as hydrochar co-compost (HCO), is under study to be used as an unconventional soil improver/fertilizer. Recent studies showed that HCO is not phytotoxic. However, nothing is known about the toxicity of HCO on cells and on whole organisms. This study aims to investigate the in vitro genotoxicity and cytotoxicity of HCO and its precursors in the production process. In particular, we tested water and methanolic extracts of HCO (WEHCO and MEHCO) on one side, and methanolic extracts of hydrochar (MEH) and OFMSW digestate (MED) as well as the liquor produced downstream of HTC (HTCL) on the other side. Genotoxicity was investigated using the cytokinesis-block micronucleus assay in Chinese Hamster Ovary K1 (CHO-K1) cells. Cytotoxicity was tested in vitro against a panel of human cell lines. Zebrafish embryo toxicity upon MEH treatment was also investigated. Results show that incubation of CHO-K1 cells with all the tested samples at different concentrations did not cause any induction of micronucleus formation compared to the vehicle-treated control. Treatment of cells with MEH, MED, HTCL and MEHCO, but not WEHCO, induced some degree of cytotoxicity, and MEH proved more cytotoxic against the tested cells than MEHCO. At the highest tested concentrations, MEH caused coagulation, pericardial edema and death in zebrafish embryos. In conclusion, the cytotoxicity of hydrochar co-compost is similar to that of standard compost. Hence, composting the hydrochar from OFMSW digestate is an effective step toward eliminating the cytotoxicity of hydrochar.
Homogenization of the incremental response of grids made up of preloaded elastic rods leads to homogeneous effective continua which may suffer macroscopic instability, occurring at the same time in both the grid and the effective continuum. This instability corresponds to the loss of ellipticity in the effective material and the formation of localized responses as, for instance, shear bands. Using lattice models of elastic rods, loss of ellipticity has always been found to occur for stress states involving compression of the rods, as usually these structural elements buckle only under compression. In this way, the locus of material stability for the effective solid is unbounded in tension, i.e. the material is always stable for a tensile prestress. A rigorous application of homogenization theory is proposed to show that the inclusion of sliders (constraints imposing axial and rotational continuity, but allowing shear jumps) in the grid of rods leads to loss of ellipticity in tension so that the locus for material instability becomes bounded. This result explains (i) how to design elastic materials subject to localization of deformation and shear banding for all radial stress paths; and (ii) how for all these paths a material may fail by developing strain localization and without involving cracking. This article is part of the theme issue ‘Wave generation and transmission in multi-scale complex media and structured metamaterials (part 1)’.
A structural element is designed and investigated, forming the basis for the development of an elastic multistable metamaterial. The leitmotif of the structural design is the implementation of a strut characterized by a bifurcation occurring at either vanishing tensile or compressive load. It is shown that buckling at null load leads to a mechanical equivalence with a unilateral constraint formulation, introducing shocks in dynamics. Towards a future analysis of the latter, the nonlinear quasi-static response is investigated, showing the multistable character of the structure, which may appear as bistable or tetrastable. This article is part of the theme issue ‘Wave generation and transmission in multi-scale complex media and structured metamaterials (part 1)’.
The g-and-h distribution is a flexible model for skewed and/or leptokurtic data, which has been shown to be especially effective in actuarial analytics and risk management. Since in these fields data are often recorded only above a certain threshold, we introduce a left-truncated g-and-h distribution. Given the lack of an explicit density, we estimate the parameters via an Approximate Maximum Likelihood approach that uses the empirical characteristic function as summary statistic. Simulation results and an application to fire insurance losses suggest that the method works well and that the explicit consideration of truncation is strongly preferable to the use of the non-truncated g-and-h distribution.
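For readers unfamiliar with the model: the (Tukey) g-and-h random variable is an explicit transform of a standard normal Z, namely X = A + B·((e^{gZ} − 1)/g)·e^{hZ²/2}, with g controlling skewness and h tail heaviness; the density has no closed form, which motivates the simulation-based estimation above. Below is a minimal sampler with naive left truncation by rejection, an illustrative sketch and not the authors' Approximate Maximum Likelihood code:

```python
import math
import random

def gh_transform(z, A=0.0, B=1.0, g=0.5, h=0.1):
    """Tukey g-and-h transform of a standard normal draw z.
    For g = 0 the skewness factor degenerates to z itself."""
    core = z if g == 0 else (math.exp(g * z) - 1.0) / g
    return A + B * core * math.exp(h * z * z / 2.0)

def sample_truncated_gh(n, threshold, rng=random.Random(42), **params):
    """Left-truncated g-and-h sampling by rejection: keep only draws
    above `threshold`. Inefficient if the threshold sits deep in the
    right tail; illustrative only."""
    out = []
    while len(out) < n:
        x = gh_transform(rng.gauss(0.0, 1.0), **params)
        if x > threshold:
            out.append(x)
    return out
```

Because the transform is strictly increasing (for h ≥ 0), quantiles of X are just transformed normal quantiles, which is also what makes quantile-based estimators of g and h popular.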
School closures, forcibly brought about by the COVID-19 crisis in many countries, have impacted children’s lives and their learning processes. The heterogeneous implementation of distance learning solutions is likely to bring a substantial increase in education inequality, with long-term consequences. The present study uses data from a survey collected during the Spring 2020 lockdown in France and Italy to analyze parents’ evaluations of their children’s home schooling process and emotional well-being at the time of school closure, and the role played by different distance learning methods in shaping these perceptions. While Italian parents have a generally worse judgment of the effects of the lockdown on their children, the use of interactive distance learning methods appears to significantly attenuate their negative perception. This is particularly true for older pupils. French parents, by contrast, perceive that interactive methods are effective in mitigating learning losses and psychological distress only for their secondary school children. In both countries, further heterogeneity analyses reveal that parents perceive younger children and boys to suffer more during this period.
The evolution of High-Performance Computing (HPC) platforms enables the design and execution of progressively larger and more complex workflow applications in these systems. The complexity comes not only from the number of elements that compose the workflows but also from the type of computations they perform. While traditional HPC workflows target simulations and modelling of physical phenomena, current needs additionally require data analytics (DA) and artificial intelligence (AI) tasks. However, the development of these workflows is hampered by the lack of proper programming models and environments that support the integration of HPC, DA, and AI, as well as the lack of tools to easily deploy and execute the workflows in HPC systems. To progress in this direction, this paper presents use cases where complex workflows are required and investigates the main issues to be addressed for the HPC/DA/AI convergence. Based on this study, the paper identifies the challenges of a new workflow platform to manage complex workflows. Finally, it proposes a development approach for such a workflow platform addressing these challenges in two directions: first, by defining a software stack that provides the functionalities to manage these complex workflows; and second, by proposing the HPC Workflow as a Service (HPCWaaS) paradigm, which leverages the software stack to facilitate the reusability of complex workflows in federated HPC infrastructures. Proposals presented in this work are subject to study and development as part of the EuroHPC eFlows4HPC project.
By revisiting the notion of generalized second fundamental form originally introduced by Hutchinson for a special class of integral varifolds, we define a weak curvature tensor that is particularly well-suited for being extended to general varifolds of any dimension and codimension through regularization. The resulting approximate second fundamental forms are defined not only for piecewise-smooth surfaces, but also for datasets of very general type (like, e.g., point clouds). We obtain explicitly computable formulas for both weak and approximate curvature tensors, we exhibit structural properties and prove convergence results, and lastly we provide some numerical tests on point clouds that confirm the generality and effectiveness of our approach.
In this paper, we lay the foundations of the theory of slice regular functions in several (non-commuting) variables ranging in any real alternative *-algebra, including quaternions, octonions and Clifford algebras. This higher-dimensional function theory is an extension of the classical theory of holomorphic functions of several complex variables. It is based on the construction of a family of commuting complex structures on ℝ^{2^n}. One of the relevant aspects of the theory is the validity of a Cauchy-type integral formula and the existence of ordered power series expansions. The theory includes all polynomials and power series with ordered variables and right coefficients in the algebra. We study the real dimension of the zero set of polynomials in the quaternionic and octonionic cases and give some results about the zero set of polynomials with Clifford coefficients. In particular, we show that a nonconstant polynomial always has a non-empty zero set.
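The claim about the real dimension of quaternionic zero sets can be made concrete with a classical example: every imaginary unit quaternion squares to −1, so the zero set of q² + 1 is the whole 2-sphere of imaginary units (real dimension 2), in sharp contrast with the two isolated roots ±i over the complex numbers. A quick check with hand-rolled quaternion arithmetic (tuples (w, x, y, z); illustrative only):

```python
import math

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def is_root_of_q2_plus_1(q, tol=1e-12):
    """Check whether q solves q^2 + 1 = 0."""
    w, x, y, z = qmul(q, q)
    return math.isclose(w, -1.0, abs_tol=tol) \
        and max(abs(x), abs(y), abs(z)) <= tol
```

Any (0, a, b, c) with a² + b² + c² = 1 passes the check, which is exactly the 2-sphere of roots.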
Smart contracts (i.e., agreements enforced by a blockchain) are supposed to work at lower transaction costs than traditional (and incomplete) contracts that instead exploit a costly legal enforcement. This paper challenges that claim. I argue that because of the need for adaptation to mutable and unpredictable occurrences (a chief challenge of transaction cost economics à la Oliver Williamson), smart contracts may incur higher transaction costs than traditional contracts. This paper focuses on two problems related to the adaptation: first, smart contracts are constructed to limit and potentially avoid any ex-post legal intervention, including efficiency-enhancing adaptation by courts. Second, the consensus mechanism on which every smart contract depends may lead to additional transaction costs due to a majority-driven adaptation of the blockchain that follows Mancur Olson's Logic of groups. The paper further proposes several institutional expedients that may reduce these transaction costs of smart contracts.
Background: Sleep is crucial for child development, especially for children with ASD. While it is known that children with ASD experience more severe sleep problems, and that these problems tend to persist, compared to their typically developing counterparts, these findings come almost exclusively from Western countries. A cross-cultural study is important to understand whether the prevailing understanding of sleep in children with ASD extends to different cultural backgrounds. Aim: A cross-cultural study is conducted, involving typically developing children and children with ASD aged 5–12 across two countries: Saudi Arabia and the United Kingdom. Methods and procedures: Using a combination of questionnaires measuring ASD severity (CARS-2), sleep quality (CSHQ), sociodemographic and lifestyle variables, and sleep diaries, 244 children were sampled using a mixture of snowball and convenience sampling methods. Outcomes and results: Children with ASD experience more sleep problems than typically developing children in Saudi Arabia, and these problems similarly persist across time. Specifically, children with ASD in Saudi Arabia experience greater sleep onset latency and a greater number of night awakenings. Additionally, across the ASD groups, children from Saudi Arabia generally experienced poorer sleep than children in the United Kingdom in terms of shorter sleep duration, although children in the United Kingdom tended to report more instances of sleep anxiety and parasomnias. Conclusions and implications: Several factors, such as parental education about sleep hygiene, cultural influences and social hours, were put forward as potential explanations for the cross-cultural differences. Findings emphasise the importance of culturally appropriate interventions and public education regarding child sleep.
8,653 members
• Department of Industrial Engineering
• CIMEC - Center for Mind/Brain Sciences
• Department of Information Engineering and Computer Science
• Department of Civil, Environmental and Mechanical Engineering