Recent publications
A Collaborative Artificial Intelligence System (CAIS) is a cyber-physical system that learns actions in collaboration with humans in a shared environment to achieve a common goal. In particular, a CAIS is equipped with an AI model to support the decision-making process of this collaboration. When an event degrades the performance of a CAIS (i.e., a disruptive event), this decision-making process may be hampered or even stopped. Thus, it is of paramount importance to monitor the learning of the AI model and, in such circumstances, to support its decision-making process. This paper introduces a new methodology to automatically support the decision-making process in a CAIS when the system experiences performance degradation after a disruptive event. To this aim, we develop a framework that consists of three components: the first manages or simulates the CAIS's environment and disruptive events, the second automates the decision-making process, and the third provides a visual analysis of CAIS behavior. Overall, our framework automatically monitors the decision-making process, intervenes whenever a performance degradation occurs, and recommends the next action. We demonstrate our framework by implementing an example with a real-world collaborative robot, where the framework recommends the next action that balances minimizing the recovery time (i.e., resilience) against minimizing adverse energy effects (i.e., greenness).
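As an illustration of the monitor-and-recommend loop described in this abstract, the following minimal Python sketch flags a performance drop and ranks candidate recovery actions by a weighted trade-off between recovery time (resilience) and energy cost (greenness); all names, thresholds, and numbers are hypothetical, not the paper's implementation.

```python
# Minimal sketch (not the paper's implementation): a monitor that flags
# performance degradation and ranks candidate recovery actions by a
# weighted trade-off between recovery time (resilience) and energy
# cost (greenness). All names and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    est_recovery_s: float   # estimated time to restore nominal performance
    est_energy_j: float     # estimated extra energy consumed

def degraded(history, window=5, threshold=0.8):
    """Flag degradation when the recent mean accuracy drops below threshold."""
    recent = history[-window:]
    return sum(recent) / len(recent) < threshold

def recommend(actions, w_resilience=0.5, w_greenness=0.5):
    """Return the action minimising the weighted, normalised cost."""
    t_max = max(a.est_recovery_s for a in actions)
    e_max = max(a.est_energy_j for a in actions)
    cost = lambda a: (w_resilience * a.est_recovery_s / t_max
                      + w_greenness * a.est_energy_j / e_max)
    return min(actions, key=cost)

accuracy_log = [0.93, 0.91, 0.92, 0.62, 0.55, 0.58, 0.61, 0.60]
if degraded(accuracy_log):
    best = recommend([Action("retrain_from_scratch", 900, 5000),
                      Action("fine_tune_recent_data", 120, 800),
                      Action("fallback_to_rule_based", 10, 50)])
    print("recommended:", best.name)
```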
Motivation: Technical debt is a metaphor describing not-quite-right code introduced to meet short-term needs. Developers are aware of it and admit it in source code comments, a practice known as Self-Admitted Technical Debt (SATD). SATD therefore indicates weak code that developers are aware of. Problem statement: Inspecting source code is time-consuming; automatically inspecting source code for vulnerabilities is therefore a crucial aspect of developing software, as it spares practitioners a time-consuming process and lets them focus on the vulnerable parts of the source code. Proposal: Accurately identify and better understand the semantics of self-admitted technical debt (SATD) by leveraging NLP and NL-PL approaches to detect vulnerabilities and the related SATD. Finally, a CI/CD pipeline will be proposed to make the vulnerability discovery process easily accessible to practitioners.
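As a flavour of the NLP side of this proposal, the sketch below trains a bag-of-words baseline to flag SATD comments; the pipeline and the tiny training set are illustrative stand-ins, not the NL-PL models the work actually proposes.

```python
# A minimal sketch of SATD comment detection with a bag-of-words baseline
# (scikit-learn); the paper's NLP/NL-PL models are more sophisticated, and
# the tiny training set here is purely illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "TODO: this is a temporary hack, refactor before release",
    "FIXME: fails on empty input, needs proper validation",
    "ugly workaround for the race condition, revisit later",
    "computes the checksum of the payload",
    "iterate over all registered handlers",
    "returns the user's display name",
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = SATD, 0 = regular comment

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(comments, labels)

print(clf.predict(["temporary workaround, refactor later"]))  # -> [1]
```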
Process mining is one of the research disciplines belonging to the field of Business Process Management (BPM). The central idea of process mining is to use real process execution logs in order to discover, model, and improve business processes. There are multiple approaches to modeling processes, the most prevalent being procedural models such as Petri nets and BPMN models. However, procedural models can be difficult to use for processes, like software processes, that are highly variable and can have a high number of different branches and exceptions. In these cases, it may be better to use declarative models, because declarative models do not aim to model the end-to-end process step by step but instead constrain the behavior of the process using rules, thus allowing for more flexibility in the process executions. The goal of this paper is to introduce the main principles of declarative process mining (i.e., process mining based on declarative models) and to show which state-of-the-art declarative process mining techniques have been implemented in the RuM toolkit and in the Declare4Py Python library.
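To make the declarative idea concrete, the sketch below checks a single DECLARE-style rule, Response(a, b) ("every a is eventually followed by b"), against toy traces; RuM and Declare4Py implement such constraint checking (and discovery) over real event logs, so this standalone function is only a pedagogical stand-in.

```python
# A self-contained sketch of the declarative idea: instead of modelling the
# end-to-end process, a DECLARE-style rule constrains behaviour. Here we
# check the Response constraint "every 'a' is eventually followed by 'b'"
# on example traces.
def response_holds(trace, a, b):
    """Response(a, b): each occurrence of a must be followed by some b."""
    pending = False
    for event in trace:
        if event == a:
            pending = True
        elif event == b:
            pending = False
    return not pending

log = [
    ["register", "review", "approve"],
    ["register", "review", "reject"],
    ["register", "approve"],          # violates Response(register, review)
]
for trace in log:
    print(trace, response_holds(trace, "register", "review"))
```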
This study explores the performance tuning of flexible InGaZnO (IGZO) thin-film transistors (TFTs) using a double-gate configuration. DC analysis of individually controllable double-gate TFTs highlights that bottom-gate biasing is highly effective in facilitating efficient switching of the devices, whereas top-gate biasing allows for controlling their performance. This is demonstrated by the AC response of devices with different channel lengths, showing the tunability of $f_\mathrm{T}$ and $f_\mathrm{MAX}$ with a maximum relative tuning of up to 130% for $f_\mathrm{T}$ and 170% for $f_\mathrm{MAX}$. A more efficient control is observed for longer TFTs, resulting in characteristic frequencies increased by up to 50%. Furthermore, the performance tunability is also reported even when the double-gate TFTs are exposed to tensile strain induced by a bending radius of 2 mm. These findings indicate new possibilities for the design of flexible analog systems with dynamically adjustable performance.
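For context, the following are textbook device-physics estimates (not expressions taken from the paper) of the two figures of merit: the transition frequency scales with the transconductance over the gate capacitance, which is why biasing the second gate, by modulating $g_\mathrm{m}$ and the effective channel, tunes the characteristic frequencies.

```latex
f_\mathrm{T} \approx \frac{g_\mathrm{m}}{2\pi C_\mathrm{G}}, \qquad
f_\mathrm{MAX} \approx \frac{f_\mathrm{T}}
  {2\sqrt{g_\mathrm{ds}\,(R_\mathrm{G}+R_\mathrm{S}) + 2\pi f_\mathrm{T} R_\mathrm{G} C_\mathrm{GD}}}
```

Here $g_\mathrm{m}$ is the transconductance, $C_\mathrm{G}$ the total gate capacitance, $g_\mathrm{ds}$ the output conductance, $R_\mathrm{G}$, $R_\mathrm{S}$ the gate and source resistances, and $C_\mathrm{GD}$ the gate-drain capacitance; these are standard first-order approximations.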
Obesity has risen dramatically in the United States since the 1980s, but incidence varies across demographic groups. We investigate the potential role of economic insecurity, defined roughly as the extent to which a household faces the threat of catastrophic income loss, in explaining these changes. We construct a synthetic panel of demographic groups for 1988–2016 by combining the Economic Security Index (which measures the probability of a year-on-year drop in adjusted household income of 25% or more) with data from the NHANES surveys. This gives us a plausibly exogenous group-level measure of economic insecurity while allowing us to control for both individual characteristics and various interactions of group and year fixed effects. We find robust evidence of a link between economic insecurity and obesity, suggesting a nearly one-to-one correspondence between percentage-point changes in ESI and in obesity, for both men and women. We further show that if we instead measure economic insecurity based on the changing occupational exposure of each demographic group to trade with China over time, we find similar qualitative results for men, but not for women. Taken together, these results are supportive of a causal interpretation of our findings.
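The design described above maps onto a standard fixed-effects specification; as a hedged reconstruction (our notation, not necessarily the paper's exact model), obesity status for individual i of demographic group g in year t would be modelled as

```latex
\mathrm{Obese}_{igt} = \beta\,\mathrm{ESI}_{gt} + X_{igt}'\gamma + \alpha_g + \delta_t + \varepsilon_{igt},
```

where $\alpha_g$ and $\delta_t$ are group and year fixed effects and $X_{igt}$ are individual controls; the reported nearly one-to-one percentage-point correspondence amounts to $\beta \approx 1$.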
This paper provides an in-depth analysis of the key factors affecting the selection of purchasing channels for cosmetic products. The purpose of this research is to investigate customer perceptions of cosmetics shopping. By synthesizing insights from prior research, this research develops a conceptual model that integrates innovation diffusion theory and the theory of perceived risk and applies it to the preference for online shopping. A total of 435 questionnaires were gathered, and the data were analysed with the partial least squares structural equation modelling (PLS-SEM) method. The results show that online channels are the most popular purchasing channels for cosmetics and clarify the most significant factors that lead to online cosmetics shopping.
The extended Aharonov–Bohm electrodynamics has a simple formal structure and makes it possible to couple the e.m. field also to currents which are not locally conserved, like those resulting from certain non-local effective quantum models of condensed matter. As often happens in physics and mathematics when one tries to extend the validity of some equations or operations, new perspectives emerge in comparison to Maxwell theory, and some "exotic" phenomena are predicted. For the Aharonov–Bohm theory the main new feature is that the potentials $A^\mu$ become univocally defined and can be measured with probes in which the "extra-current" $I=\partial_\mu j^\mu$ is not zero at some points. As a consequence, it is possible in principle to detect pure gauge waves with $\mathbf{E}=\mathbf{B}=0$, which would be regarded as non-physical in the Maxwell gauge-invariant theory with local current conservation. We discuss in detail the theoretical aspects of this phenomenon and propose an experimental realization of the detectors. A full treatment of wave propagation in anomalous media with extra-currents and of energy–momentum balance issues is also given.
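As a sketch of where the extra degree of freedom lives (a common way of writing the extended theory in SI-like units; the notation is ours, not quoted from the paper), the field equations can be taken as the ungauged wave equations, whose four-divergence shows that the auxiliary scalar $S=\partial_\nu A^\nu$ is sourced precisely by the extra-current $I$:

```latex
\partial_\nu F^{\nu\mu} + \partial^\mu S = \mu_0\, j^\mu ,
\qquad S \equiv \partial_\nu A^\nu
\;\;\Longrightarrow\;\;
\Box S = \mu_0\, \partial_\mu j^\mu = \mu_0 I .
```

Where $I=0$, $S$ obeys the free wave equation, and a pure-gradient wave $A^\mu=\partial^\mu\chi$ carries $\mathbf{E}=\mathbf{B}=0$ but a non-zero $S$: this is the gauge-wave signal the abstract refers to.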
The field of tumor phylogenetics focuses on studying the differences within cancer cell populations. Much effort is devoted within the scientific community to building cancer progression models in an attempt to understand the heterogeneity of such diseases. These models are highly dependent on the kind of data used for their construction; therefore, as experimental technologies evolve, it is of major importance to exploit their peculiarities. In this work we describe a cancer progression model based on single-cell DNA sequencing data. When constructing the model, we focus on tailoring the formalism to the specificities of the data. We operate by defining a minimal set of assumptions needed to reconstruct a flexible DAG-structured model, capable of identifying progression beyond the limitations of the infinite sites assumption. Our proposal is conservative in the sense that we aim neither to discard nor to infer knowledge which is not represented in the data. We provide simulations and analytical results to show the features of our model, test it on real data, and show how it can be integrated with other approaches to cope with input noise. Moreover, our framework can be exploited to produce simulated data that follows our theoretical assumptions. Finally, we provide an open-source R implementation of our approach, called CIMICE, which is publicly available on Bioconductor.
Lagoecia cuminoides L. is a very rare and threatened taxon in Italy, never before studied for its ecology and potential use for human consumption. Furthermore, few data are available on the biological activities of its metabolites. A phytosociological study was carried out in the only two Italian sites, and its state of conservation was evaluated according to the IUCN (International Union for Conservation of Nature) protocol. The collected plant material was used to prepare two types of extracts: a hot water infusion, to evaluate the use of this plant as a tea, and a hydroalcoholic extraction, to evaluate its use in herbal liqueur preparation. The presence of functional compounds in the extracts was investigated by gas and liquid chromatography coupled to mass spectrometry techniques. Ten non-volatile compounds were identified in the extracts, most of which were derivatives of quercetin. Thirty-five volatile compounds belonging to the chemical class of terpenoids were also identified in the plant aerial part and extracts; among them, β-farnesene, thymol, γ-terpinene and p-cymene were the most abundant. The species is characterized by compounds known for their health effects and by potential applications for human consumption, this species being already used as a decoction in some Middle Eastern countries. Thanks to its characteristic ability to grow under limiting pedoclimatic conditions, this species could potentially be used in organic farms situated in marginal rural areas.
Monists and pluralists disagree concerning how many ordinary objects there are in a single situation. For instance, pluralists argue that a statue and the clay it is made of have different properties, and thereby are different. The standard monist's response is to hold that there is just a single object, and that, under the description "being a statue", this object is, e.g., aesthetically valuable, and that, under the description "being a piece of clay", it is not aesthetically valuable. However, Fine provided an ontological reading of the expression "an object under a description": the theory of rigid embodiments. The debate between monists and pluralists recurs in the domain of ordinary occurrences, like walks and conferences. Specifically, they disagree whether an occurrence in progress (also called "process") like John's walk that is happening at $t_n$ is identical to some completed occurrence (also called "event") like John's walk that happened between, e.g., $t_1$ and $t_n$. Adopting the pluralist's position, the article aims to provide a novel theory of ordinary occurrences that develops the ontological reading of "under a description" to account for occurrences in progress and completed occurrences. As a first result, we formulate a theory according to which both occurrences in progress and completed occurrences are rigid embodiments. As a second result, we argue that the suggested theory is explanatorily powerful to the extent that it solves two puzzles that we call "the Puzzle from the Completion of a Process" and "the Metaphysical-cum-Semantical Puzzle".
Seen from an international perspective, the professionalisation of (prospective) teachers for inclusive education is a main determining factor for the inclusion-related further development of schools and classes in various subjects. However, the prerequisites for acquiring inclusion-related expertise and professionalism for designing inclusive teaching vary greatly from country to country. This is especially evident in the stratified secondary level of the education system in Germany. In this paper, we draw on the results of a study describing and analysing the acquisition of expertise for inclusive mathematics teaching. Guided interviews were conducted with mathematics teachers experienced in inclusive teaching at the secondary level and analysed with qualitative methods. Among other elements, concrete tasks, assignments, and video vignettes from teaching were used as narrative-generating prompts. This paper elaborates on how the expertise and professionalism for inclusive mathematics teaching gained under ambivalent conditions can be described and what indications can be found for its acquisition. It pays special attention to the reflexive evocation and handling of contingencies as key moments for the further development of a teacher's professionalism in inclusive mathematics teaching.
Computation at the edge or within the Internet of Things (IoT) requires the use of controllers to make the management of resources in this setting self-adaptive. Controllers are software components that observe a system, analyse its quality, and recommend and enact decisions to maintain or improve quality. Today, reinforcement learning (RL), which operates on a notion of reward, is often used to construct these controllers. Here, we investigate quality metrics and quality management processes for RL-constructed controllers in edge and IoT settings. We introduce RL and control principles and define a quality-oriented controller reference architecture. This forms the basis for the central contribution, a quality analysis metrics framework embedded into a quality management process.
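As a minimal sketch of a reward-driven controller in the observe/analyse/act style just described (our illustration, not the paper's reference architecture), consider tabular Q-learning over discrete load states and scaling actions, with a reward built from quality metrics; the states, actions, and reward shaping are hypothetical.

```python
# Illustrative sketch: a reward-driven controller using tabular Q-learning
# over discrete load states and scaling actions. States, actions, and the
# reward shaping are hypothetical.
import random

STATES = ["low_load", "normal", "overload"]
ACTIONS = ["scale_down", "keep", "scale_up"]
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def reward(latency_ms, energy_w):
    """Quality-oriented reward: penalise latency violations and energy use."""
    return -(max(0.0, latency_ms - 100.0) + 0.1 * energy_w)

def choose(state, eps=0.1):
    if random.random() < eps:                         # explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])  # exploit

def update(s, a, r, s_next, alpha=0.1, gamma=0.9):
    best_next = max(Q[(s_next, a2)] for a2 in ACTIONS)
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

# One control-loop step, with the observations stubbed out:
state = "overload"
action = choose(state)
r = reward(latency_ms=140.0, energy_w=60.0)           # observed quality metrics
update(state, action, r, s_next="normal")
```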
In many Web and Internet-based systems, sharing Personally Identifiable Information (PII) to identify persons and other entities is common, but centralized systems such as central registries have limitations in terms of privacy and identity control that a decentralized identity management architecture could address. This study aims to compare the current and potential systems, analyze protocols for decentralized identification and data exchange, propose a protocol selection method, and provide a simple code example. The goal is to assess the feasibility of decentralized processes in software-based business workflows. The methodology involves reviewing protocol materials, including white papers, articles, and code documentation, alongside ontological aspects of identification. Challenges to implementing Decentralized Identifiers (DIDs) include interoperability and the evolving Web/Internet landscape, which is moving towards more decentralization, openness, and greater user utility.
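As the kind of simple code example the study mentions, the snippet below builds a minimal DID document shaped after the W3C DID Core examples; the did:example method is reserved for illustration and the key material is fake.

```python
# A minimal DID document, shaped after the W3C DID Core examples (the
# did:example method is reserved for illustration; key material is fake).
import json

did = "did:example:123456789abcdefghi"
did_document = {
    "@context": ["https://www.w3.org/ns/did/v1"],
    "id": did,
    "verificationMethod": [{
        "id": f"{did}#key-1",
        "type": "Ed25519VerificationKey2020",
        "controller": did,
        "publicKeyMultibase": "zH3C2AVvLMv6gmMNam3uVAjZpfkcJCwDwnZn6z3wXmqPV",
    }],
    "authentication": [f"{did}#key-1"],
}
print(json.dumps(did_document, indent=2))
```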
Family managers' entrepreneurial intentions (EI) play a crucial role in the long-term success of family firms. Previous research has highlighted education as a key driver of EI but has failed to consider the unique socialization processes within business families and their impact on the education-EI relationship. This study aims to fill this gap by examining the direct and indirect effects of education on family managers' EI. By combining the integrated model of EI and research on business families' socialization patterns, a study was conducted with a role-playing experimental design involving 412 family firm managers. The results indicate that entrepreneurial self-efficacy (ESE) serves as a mediator between education and EI, while the ESE-EI relationship is further mediated by risk perceptions. Interestingly, no direct effect of education on EI was found, suggesting that the influence of education on EI follows distinct patterns within business families.
Introduction: Forests are threatened by increasingly severe and more frequent drought events worldwide. Mono-specific forests, developed as a consequence of widespread management practices established early last century, seem particularly susceptible to global warming and drought compared with mixed-species forests. Although, in several contexts, mixed-species forests display higher species diversity, higher productivity, and higher resilience, previous studies have highlighted contrasting findings, with not only many positive but also neutral or negative effects on tree performance attributable to tree species diversity. The processes underlying this relationship need to be investigated. Wood anatomical traits are informative proxies of tree functioning, and they can potentially provide novel long-term insights in this regard. However, wood anatomical traits are critically understudied in this context. Here, we assess the role of tree admixture on Pinus sylvestris L. xylem traits such as mean hydraulic diameter, cell wall thickness, and anatomical wood density, and we test the variability of these traits in response to climatic parameters such as temperature, precipitation, and drought event frequency and intensity.
Methods: Three monocultural plots of P. sylvestris and three mixed-stand plots of P. sylvestris and Quercus sp. were identified in Poland and Spain, representing Continental and Mediterranean climate types, respectively. In each plot, we analyzed xylem traits from three P. sylvestris trees, for a total of nine trees in monocultures and nine in mixed stands per study location.
Results: The results highlighted that anatomical wood density was one of the most sensitive traits for detecting tree responses to climatic conditions and drought under different climate and forest types. Inter-specific facilitation mechanisms were detected in the admixture between P. sylvestris and Quercus sp., especially during the early growing season and during stressful events such as spring droughts, although they had negligible effects in the late growing season.
Discussion: Our findings suggest that the admixture between P. sylvestris and Quercus sp. increases the resilience of P. sylvestris to extreme droughts. In a global warming scenario, this admixture could represent a useful adaptive management option.
Summary
Against the background of advancing worldwide urbanisation, and thus the growing importance of cities as a living environment for humans, it is indispensable to place the health and well-being of urban residents at the centre of urban planning. The research funding programme "Stadt der Zukunft – gesunde und nachhaltige Metropolen" ("City of the Future – Healthy and Sustainable Metropolises") of the Fritz und Hildegard Berg-Stiftung at the Deutsches Stiftungszentrum has been providing important inter- and transdisciplinary research impulses in this area since 2010. Junior research groups at the interfaces of the health sciences and other disciplines of the natural and social sciences were funded. One focus of the funding also lay on intensive cooperation with practice, on the one hand to reflect on and address problems and questions from practice in a scientifically sound way, and on the other hand to catalyse knowledge transfer. The research consortia are accompanied by annual conferences on a wide range of urban-health topics at changing venues across Germany. In addition, transdisciplinary networks have been established and further initiatives (e.g., the founding of an institute) have been launched. The development of the multifaceted research funding programme "Stadt der Zukunft – gesunde und nachhaltige Metropolen" is presented, together with an outlook on its future development.
Graphs have been used as a model of complex relationships among data in the biological sciences since the advent of systems biology in the early 2000s. In particular, graph data analysis and graph data mining play an important role in biological interaction networks, where recent artificial intelligence techniques, usually employed on other types of networks (e.g., social, citation, and trademark networks), aim to implement various data mining tasks including classification, clustering, recommendation, anomaly detection, and link prediction. The commitment and efforts of artificial intelligence research in network biology are motivated by the fact that machine learning techniques are often prohibitively computationally demanding, poorly parallelizable, and ultimately inapplicable, since biological networks of realistic size are large systems characterised by a high density of interactions and often by non-linear dynamics and a non-Euclidean latent geometry. Currently, graph embedding emerges as a new learning paradigm that shifts the task of building complex models for classification, clustering, and link prediction to learning an informative representation of the graph data in a vector space, so that many graph mining and learning tasks can be performed more easily by employing efficient, non-iterative traditional models (e.g., a linear support vector machine for the classification task). The great potential of graph embedding is the main reason for the flourishing of studies in this area and, in particular, of artificial intelligence learning techniques. In this mini review, we give a comprehensive summary of the main graph embedding algorithms in light of the recent burgeoning interest in geometric deep learning.
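The embed-then-classify paradigm described above can be illustrated in a few lines: embed nodes with eigenvectors of the normalised graph Laplacian (a classical choice; the review covers many newer algorithms) and train a linear SVM on the resulting vectors. The toy graph and labels are ours.

```python
# A minimal sketch of the embed-then-classify paradigm: embed nodes with
# eigenvectors of the normalised graph Laplacian, then train a linear SVM
# on the node vectors. The toy graph and labels are illustrative.
import numpy as np
from sklearn.svm import LinearSVC

# Adjacency matrix of a small graph with two loosely connected clusters.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)

# Normalised Laplacian L = I - D^{-1/2} A D^{-1/2}.
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(len(A)) - D_inv_sqrt @ A @ D_inv_sqrt

# Node embedding: eigenvectors of the smallest non-trivial eigenvalues.
eigvals, eigvecs = np.linalg.eigh(L)
X = eigvecs[:, 1:3]                      # 2-dimensional embedding

y = np.array([0, 0, 0, 1, 1, 1])         # toy node labels
clf = LinearSVC().fit(X, y)
print(clf.predict(X))                    # recovers the two clusters
```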
Summary
To sustainably reduce and manage the urban health risks of climate change, climate mitigation and climate adaptation are urgently required as complementary strategies. Manifold positive effects of urban green and blue spaces on physical and mental health have been known for decades. However, in most cities there is intense competition over land use. In line with the European Aalborg Charter of 1994, German building law demands in this challenging context that urban land-use plans ensure sustainable urban development. Human health is a concern of central importance here. The reality remains challenging, however: although a whole series of best-practice examples now exists, there is still a great need for research on the significance of urban green and blue spaces for human health and well-being. In addition, all relevant policy fields and administrative units must develop an awareness of the importance of green and blue spaces for urban quality of life and health, in order to take "health" into account adequately and in a socially sensitive manner in urban decision-making processes.
In recent years, a number of initiatives have developed and deployed new electronic voting (e-voting) systems with the goal of improving on paper-based electoral systems. Traditional e-voting systems are vulnerable to manipulation, such as the alteration of election outcomes, due to their centralized nature. Blockchain is a decentralized ledger technology in which every participant in the network accesses the same data source. The immutability of blockchains makes them well suited for e-voting systems, as no one can alter the data. Although blockchain technology has potential for e-voting systems, there are challenges, including scalability for large-scale elections.
We present a solution to the scalability of e-voting systems using the Solana blockchain. Solana provides a fast and scalable blockchain infrastructure based on the Proof of History (PoH) consensus mechanism, making it easier for developers to create efficient decentralized applications. The evaluation of factors such as throughput, delay, and cost shows that a Solana implementation can finalize a vote 32 times faster than Ethereum 2.0, at a significantly lower cost per vote. The throughput of the e-voting app on Solana indicates that it can handle a large-scale election in its entirety.
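For readers who want to probe Solana's raw throughput themselves, the standard getRecentPerformanceSamples JSON-RPC call returns recent transaction counts per sample period; the sketch below is our illustration against a public endpoint, not the paper's benchmark setup.

```python
# Back-of-envelope throughput check against a public Solana RPC endpoint
# using the standard getRecentPerformanceSamples JSON-RPC call. This is an
# illustration, not the paper's benchmark.
import requests

resp = requests.post(
    "https://api.mainnet-beta.solana.com",
    json={"jsonrpc": "2.0", "id": 1,
          "method": "getRecentPerformanceSamples", "params": [5]},
    timeout=10,
)
for s in resp.json()["result"]:
    tps = s["numTransactions"] / s["samplePeriodSecs"]
    print(f"slot {s['slot']}: ~{tps:.0f} tx/s over {s['samplePeriodSecs']}s")
```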
The Arbitrary Lagrangian–Eulerian Smoothed Particle Hydrodynamics (ALE-SPH) formulation can guarantee stable solutions without the adoption of empirical parameters such as artificial viscosity. However, the convergence rate of the ALE-SPH formulation is still limited by the inaccuracy of the SPH spatial operators. In this work, a Weighted Essentially Non-Oscillatory (WENO) spatial reconstruction is adopted to minimise the numerical diffusion introduced by the approximate Riemann solver (which ensures stability), in combination with two alternative approaches to restore the consistency of the scheme: corrected divergence SPH operators and the particle regularisation guaranteed by the correction of the transport velocity. The present work has been developed in the framework of the DualSPHysics open-source code. The beneficial effect of the WENO reconstruction in reducing numerical diffusion in ALE-SPH schemes is first confirmed by analysing the propagation of a small pressure perturbation in a fluid initially at rest. With the aid of a 2-D vortex test case, it is then demonstrated that the two aforementioned techniques to restore consistency effectively reduce saturation in the convergence to the analytical solution; moreover, high-order (above second) convergence is achieved. The presented scheme is then tested by means of a circular blast wave problem to demonstrate that the restoration of consistency is a key feature for guaranteeing accuracy even in the presence of a discontinuous pressure field. Finally, a standing wave is reproduced with the aim of assessing the capability of the proposed approach to simulate free-surface flows.
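For reference, the essential ingredient of any WENO reconstruction (shown here in its generic textbook form; the paper's SPH-specific variant may differ in details) is the non-linear combination of candidate stencil reconstructions, weighted by smoothness indicators so that oscillatory stencils are suppressed near discontinuities:

```latex
u^{\mathrm{WENO}} = \sum_{k} \omega_k\, u^{(k)}, \qquad
\omega_k = \frac{\alpha_k}{\sum_j \alpha_j}, \qquad
\alpha_k = \frac{d_k}{(\epsilon + \beta_k)^2},
```

where $u^{(k)}$ are the candidate reconstructions, $d_k$ the linear (optimal) weights, $\beta_k$ the smoothness indicators, and $\epsilon$ a small constant avoiding division by zero.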
Information
Address
Piazza Università 1, 39100 Bolzano, Italy
Head of institution
Prof. Ulrike Tappeiner
Website
http://www.unibz.it
Phone
+39 0471 2012100