Association for Computing Machinery
  • New York City, United States
Recent publications
Computational chemistry is at the forefront of solving urgent societal problems, such as polymer upcycling and carbon capture. The complexity of modeling these processes at appropriate length and time scales is manifested mainly in the number and types of chemical species involved in the reactions, and it may require models of several thousand atoms and large basis sets to accurately capture the chemical complexity and heterogeneity of the physical and chemical processes. The quantum chemistry package General Atomic and Molecular Electronic Structure System (GAMESS) offers a wide array of methods that can efficiently and accurately treat complex chemical systems. In this work, we have used the GAMESS Effective Fragment Molecular Orbital (EFMO) method for the electronic structure calculation of a challenging mesoporous silica nanoparticle (MSN) model surrounded by about 4700 water molecules to investigate strong scaling and GPU offloading on hybrid CPU-GPU nodes. Experiments were performed on the Perlmutter platform at the National Energy Research Scientific Computing Center. Good strong scaling and load balancing were observed on up to 88 hybrid nodes for different settings of the execution parameters of the calculation considered here. When GPUs are oversubscribed by offloading work from multiple CPU processes, using the NVIDIA Multi-Process Service (MPS) consistently reduced time to solution and energy consumed. Additionally, for some configuration parameter settings, oversubscription with MPS improved performance by up to 5.8% over the case without oversubscription.
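As a rough illustration of how strong-scaling efficiency and the MPS improvement reported above can be computed from raw timings, the sketch below uses hypothetical node counts and wall-clock times (placeholders, not measurements from the paper).

```python
# Hypothetical wall-clock times (seconds) for the same EFMO calculation at
# several node counts; these numbers are placeholders, not measured data.
baseline_nodes, baseline_time = 11, 5200.0
runs = {22: 2700.0, 44: 1450.0, 88: 800.0}

def strong_scaling_efficiency(nodes, time):
    """Efficiency relative to the baseline: ideal scaling keeps nodes*time constant."""
    return (baseline_nodes * baseline_time) / (nodes * time)

for nodes, time in sorted(runs.items()):
    print(f"{nodes:3d} nodes: speedup {baseline_time / time:5.2f}x, "
          f"efficiency {strong_scaling_efficiency(nodes, time):.2f}")

# Relative improvement from oversubscribing GPUs via MPS (placeholder timings).
t_no_mps, t_mps = 1450.0, 1370.0
print(f"MPS improvement: {100 * (t_no_mps - t_mps) / t_no_mps:.1f}%")
```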
A response to William Bowman's Communications Opinion article, "ACM Profits Considered Harmful".
In this paper, we explore the dynamic market share and public healthcare costs of trastuzumab's evergreening (subcutaneous) variant during the introduction of competing trastuzumab biosimilars in the Netherlands. We used a time series design to assess the dynamic market share of the evergreening variant after the biosimilars were introduced, focusing on the number of treatments and patients. The public healthcare costs of this evergreening strategy were estimated using administrative claims data. Our results show that the original trastuzumab was completely replaced by the subcutaneous and biosimilar variants. The uptake of the subcutaneous form peaked at a 50% market share but, after the introduction of the biosimilars, progressively declined to a 20% market share, resulting in a more competitive market structure. Public healthcare costs for trastuzumab decreased significantly after the introduction of the biosimilars: a substantial price drop is visible, with the subcutaneous version, still under patent, also falling sharply in price, though less strongly than the IV/biosimilar version. As these costs are publicly funded, we recommend a more explicit societal debate on whether the potential benefits of subcutaneous Herceptin® (and other similar medicines) are worth the additional costs, and at what price it should be reimbursed as part of the benefit package.
The Lean construction approach to project management incorporates the ideas of Lean thinking and Lean principles to minimize waste and add value for the customer. Several firms worldwide have reported benefits through enhanced productivity by adopting Lean techniques on project sites. Popular tools include the last planner system and value stream mapping. While these are widely used, there is a dearth of studies that address the eight types of Lean waste: defects, overproduction, waiting, unused talent, transportation, inventory, motion, and extra-processing. To this end, an experimental study was conducted across two universities, wherein students from NICMAR Pune and Nottingham Trent University, UK, collaborated for two months to identify different types of waste on project sites and explore strategies to minimize them. The experiment aimed to identify and mitigate Lean waste and to suggest a path toward zero waste in construction, aligned with the United Nations goal of sustainable development. This particular paper focuses on the 'defects' type of Lean waste. Using a fishbone diagram and 5-why analysis, the root causes of defects were analysed, and strategies to overcome defects were suggested. Recommendations included effective construction management covering people, processes, and advanced technology, effective training and education in Lean methods, and incentivizing good workmanship. The measures were tested on the project sites and validated. The findings of the study are expected to serve as a stepping stone toward standardizing processes that can minimize waste in construction.
To address the issue of non-uniform fiber volume fraction between layers in the compression compaction process of C/C soft-hard mixed preforms, a multi-unit, variable-duration cyclic compression compaction process based on the inter-laminar fiber compression viscoelastic deformation behavior is proposed. The process aims to gradually eliminate the rebound of the inter-laminar fibers and reduce the error in the inter-laminar fiber volume fraction. The mapping relationships between the rebound height of the inter-laminar fibers and the number of units, holding duration, and number of compactions are established by data fitting. The compression compaction process is determined using the Box-Behnken response surface design method, and digital devices are used for preform compaction experiments. The micro-morphology of the preform is observed with an optical microscope, and the density of the inter-laminar fibers before and after process optimization is compared. Experimental results indicate that when the number of units is 3, the holding duration is 57 s, and the number of compactions is 2, the fiber volume fraction of the soft-hard mixed preform is 42.90%, which is 12.16% higher than before process optimization, and the error in the inter-laminar fiber volume fraction is less than 6.5%.
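For readers unfamiliar with the Box-Behnken design mentioned above, the following sketch generates the coded three-factor design and maps it onto illustrative ranges for the number of units, holding duration, and number of compactions; the ranges are assumptions for demonstration, not the values used in the study.

```python
import itertools
import numpy as np

def box_behnken_3(center_points=3):
    """Three-factor Box-Behnken design in coded units (-1, 0, +1)."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0, 0, 0]] * center_points)  # replicated centre points
    return np.array(runs, dtype=float)

# Hypothetical factor ranges (low, high): number of units, holding duration (s),
# number of compactions (illustrative only).
lows = np.array([1.0, 30.0, 1.0])
highs = np.array([5.0, 90.0, 3.0])

coded = box_behnken_3()
actual = lows + (coded + 1.0) / 2.0 * (highs - lows)  # map -1..+1 to low..high
print(actual)
```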
Global warming has been one of the major issues for decades. Implementing an integrated solution to obtain a sustainable control system poses significant challenges, and acceptance of the key precautions and maintenance of sustainability has become a bottleneck. Human social activities increase the consumption of fossil fuels as power resources, which in turn has increased the concentration of greenhouse gases and water vapor in the environment. This causes the earth's average surface temperature to rise. Water vapor is responsible for two-thirds of this effect; however, CO2 has proven to be the key factor in global warming. The global temperature could increase by about 3.8 °C if the CO2 concentration were doubled. The trend and association between rising temperatures and CO2 emissions are shown in this work, along with an explanation of the many methods behind ideal mitigation. One of the goals of this research is to identify the issue area and integrate the climatic data, which will assist in obtaining a dependable control system capable of addressing global warming concerns. This paper outlines how data engineering and analytics may be used to regulate global warming and provides an architecture for the Green Warming Model (GWD). The primary contribution of this study is therefore environment preservation using data engineering and machine learning (ML).
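To make the trend-and-association analysis concrete, a minimal sketch of the kind of correlation and trend estimation such a data pipeline might perform is shown below; the CO2 and temperature series are synthetic placeholders, not real climate records.

```python
import numpy as np

# Synthetic placeholder series: annual CO2 concentration (ppm) and global mean
# temperature anomaly (deg C); a real pipeline would ingest observational data.
years = np.arange(1980, 2021)
co2 = 338.0 + 1.9 * (years - 1980) + np.random.default_rng(0).normal(0, 0.5, years.size)
temp = 0.01 * (co2 - 338.0) + np.random.default_rng(1).normal(0, 0.05, years.size)

# Association between CO2 and temperature, and the linear trend of each series.
r = np.corrcoef(co2, temp)[0, 1]
temp_trend_per_decade = np.polyfit(years, temp, 1)[0] * 10
co2_trend_per_decade = np.polyfit(years, co2, 1)[0] * 10

print(f"Pearson r(CO2, temperature) = {r:.2f}")
print(f"Temperature trend: {temp_trend_per_decade:.2f} deg C / decade")
print(f"CO2 trend: {co2_trend_per_decade:.1f} ppm / decade")
```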
The ongoing transformation of health systems around the world aims at personalized, preventive, predictive, participative precision medicine, supported by technology. It considers individual health status, conditions, and genetic and genomic dispositions in personal, social, occupational, environmental, and behavioral contexts. In this way, it transforms health and social care from art to science by fully understanding the pathology of diseases and turning health and social care from reactive to proactive. The challenge is understanding, and formally and consistently representing, the world of sciences and practices, i.e., multidisciplinary and dynamic systems in variable contexts. Such a representation enables mapping between the different disciplines, methodologies, perspectives, intentions, languages, etc., as philosophy or the cognitive sciences do. The approach requires the deployment of advanced technologies, including autonomous systems and artificial intelligence, which poses important ethical and governance challenges. This paper describes the aforementioned transformation of health and social care ecosystems as well as the related challenges and solutions, resulting in a sophisticated, formal reference architecture. The reference architecture provides a system-theoretical, architecture-centric, ontology-based, policy-driven model and framework for designing and managing intelligent and ethical ecosystems in general and health ecosystems in particular.
During the winding process, the filament moves at high speed with multiple configurations and large deformations, and is acted upon by winding tension, contact force, transverse force, air resistance, and other loads. Accurately predicting the trajectory and tension fluctuation of the filament under high-speed running conditions is the basis for regulating winding parameters and ensuring high-quality winding. This paper proposes a novel dynamic approach for modeling the polyester filament winding system. The filament element is established using the absolute nodal coordinate formulation. Nonlinear spring and viscous damper elements are used to establish the contact model between the filament and the mechanical parts, and a mechanical model of the influence of the airflow on the filament is established. By considering the moving filament and all the force factors, the dynamic model of the multi-body coupled filament winding system and the corresponding nonlinear dynamic equations are established, and the equations are solved using MATLAB. An example of the high-speed winding system was simulated and analyzed, and the simulated trajectory of the moving filament was highly consistent with the experimental image record.
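As a simplified illustration of the spring-damper contact model described above (not the paper's actual formulation), the snippet below evaluates a penalty contact force on the filament from the penetration depth and its rate; the Hertz-like exponent and parameter values are assumptions.

```python
def contact_force(gap, gap_rate, k=1.0e4, c=5.0):
    """Penalty contact force on the filament (illustrative parameters).

    gap      : signed distance to the guide surface (negative = penetration, m)
    gap_rate : time derivative of the gap (m/s)
    k, c     : nonlinear spring stiffness and viscous damping coefficients
    """
    penetration = max(0.0, -gap)
    if penetration == 0.0:
        return 0.0                      # no contact, no force
    spring = k * penetration ** 1.5     # Hertz-like nonlinear spring (assumption)
    damper = c * max(0.0, -gap_rate)    # damping acts only while approaching
    return spring + damper

# Example: 0.2 mm penetration, surfaces approaching at 0.05 m/s
print(contact_force(-2.0e-4, -0.05))
```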
Trust is risky. The mere perception of strategically deceptive behavior that disguises intent or conveys unreliable information can inhibit cooperation. As gregariously social creatures, human beings would have evolved physiologic mechanisms to identify likely defectors in cooperative tasks, though these mechanisms may not cross into conscious awareness. We examined trust and trustworthiness in an ecologically valid manner by (i) studying working-age adults, (ii) having them make decisions with meaningful stakes, and (iii) permitting participants to discuss their intentions face-to-face prior to making private decisions. In order to identify why people fulfill or renege on their commitments, we measured neurophysiologic responses in blood and with electrodermal activity while participants interacted. Participants (mean age 32) made decisions in a trust game in which they could earn up to $530. Nearly all interactions produced promises to cooperate, although first decision-makers in the trust game reneged on 30.7% of their promises and second decision-makers reneged on 28%. First decision-makers who reneged on a promise showed elevated physiologic stress on two measures (the change in adrenocorticotropic hormone and the change in skin conductance level) during pre-decision communication compared to those who fulfilled their promises, and they reported increased negative affect after their decisions. Neurophysiologic reactivity predicted who would cooperate or defect with 86% accuracy. While self-serving behavior is not rare, those who exhibit it are stressed and unhappy.
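The reported 86% prediction accuracy suggests a simple supervised classifier over the physiologic reactivity measures; a minimal sketch of such an analysis, run on synthetic stand-in data rather than the study's measurements, might look like this.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 120

# Synthetic stand-in features: change in ACTH and change in skin conductance
# during pre-decision communication (not the study's data).
defected = rng.integers(0, 2, n)
acth_change = rng.normal(0.8 * defected, 1.0, n)
scl_change = rng.normal(0.6 * defected, 1.0, n)
X = np.column_stack([acth_change, scl_change])

# Predict who will renege on a promise from neurophysiologic reactivity alone.
clf = LogisticRegression()
acc = cross_val_score(clf, X, defected, cv=5, scoring="accuracy").mean()
print(f"cross-validated accuracy: {acc:.2f}")
```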
Integral imaging, i.e., the use of lenticular optics to display stereoscopic/multiscopic images, is now being used in an array of products including glasses-free 3D displays. This paper describes integral illumination, an adaptation of integral imaging where fine-grained control of plenoptic light fields is used to realize new forms of programmable lighting. Relying on a combination of an imaging apparatus and custom lenticular optics, integral illumination devices can produce high-fidelity illusions of real and imagined light sources (e.g., spotlight, chandelier), replicating their illumination effects. Such devices have potential uses as ambient lighting fixtures, photography/videography equipment, components of artistic installations, etc. The paper will provide a general overview of integral illumination, describing its basic principles, hardware configuration, control mechanism, range of capabilities, and theoretical/practical limitations. We will also present a sample implementation of a working integral illumination device, describe its engineering details, report performance measurements, and discuss possibilities for future improvements and extensions.
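The basic control principle behind integral illumination can be sketched as a mapping from a desired emission angle to the sub-pixel that must be lit beneath each lenslet; the paraxial thin-lens model and the pitch and focal-length values below are illustrative assumptions, not the device's actual specification.

```python
import math

def subpixel_for_angle(theta_deg, focal_length_mm=3.0, pixel_pitch_mm=0.05,
                       pixels_per_lenslet=21):
    """Index of the sub-pixel under a lenslet that emits light at theta_deg.

    Paraxial thin-lens model: a pixel displaced by x from the lenslet axis
    produces a roughly collimated beam at angle theta with tan(theta) = -x / f.
    """
    x = -focal_length_mm * math.tan(math.radians(theta_deg))
    index = round(x / pixel_pitch_mm) + pixels_per_lenslet // 2
    if not 0 <= index < pixels_per_lenslet:
        raise ValueError("angle outside the lenslet's addressable range")
    return index

# Example: steer a beam 5 degrees off-axis
print(subpixel_for_angle(5.0))
```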
During software development and maintenance, defect localization is an essential part of software quality assurance. Although different techniques have been proposed for defect localization, e.g., information retrieval (IR)-based techniques and spectrum-based techniques, they can only work after a defect has been exposed, which can be too late and too costly for dealing with newly introduced bugs in daily development. Many just-in-time (JIT) defect prediction tools have also been proposed to predict buggy commits, but these tools do not locate the suspicious buggy positions within a buggy commit. To help developers detect bugs in time and avoid introducing them, JIT bug localization techniques have been proposed, which aim to locate suspicious buggy code as soon as a change commit has been submitted. In this paper, we propose a novel JIT defect localization approach, named DeepDL (Deep Learning-based defect localization), to locate defective code lines within a defect-introducing change. DeepDL employs a neural language model to capture the semantics of code lines; in this way, the naturalness of each code line can be learned and converted into a suspiciousness score. The core of DeepDL is a deep learning-based neural language model. We train the neural language model on previous snapshots (history versions) of a project so that it can calculate the naturalness of a piece of code. In its application, for a given new code change, DeepDL automatically assigns a suspiciousness score to each code line and sorts the code lines in descending order of this score. The code lines at the top of the list are considered potential defect locations. Our tool can thus help developers efficiently check buggy lines at an early stage, reducing the risk of introduced bugs and improving developers' confidence in the reliability of their software. We conducted an extensive experiment on 14 open source Java projects with a total of 11,615 buggy changes and evaluated the results using four evaluation metrics. The experimental results show that our method outperforms the state-of-the-art by a substantial margin.
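As a toy illustration of DeepDL's ranking step (with a smoothed unigram model standing in for the actual neural language model), the sketch below scores each changed line by its average negative log-probability under a model trained on earlier project snapshots and sorts lines from most to least suspicious.

```python
import math
from collections import Counter

def train_unigram(history_lines):
    # Stand-in for DeepDL's neural language model: a token unigram model with
    # add-one smoothing, trained on earlier snapshots of the project.
    counts = Counter(tok for line in history_lines for tok in line.split())
    total = sum(counts.values())
    vocab = len(counts) + 1
    return lambda tok: (counts[tok] + 1) / (total + vocab)

def suspiciousness(line, prob):
    toks = line.split() or [""]
    # Average negative log-probability ("naturalness"); higher = more surprising.
    return sum(-math.log(prob(t)) for t in toks) / len(toks)

def rank_changed_lines(changed_lines, history_lines):
    prob = train_unigram(history_lines)
    scored = [(suspiciousness(line, prob), line) for line in changed_lines]
    return sorted(scored, reverse=True)  # most suspicious lines first

# Example usage with placeholder code lines.
history = ["int sum = 0;", "for (int i = 0; i < n; i++) {", "sum += a[i];", "}"]
change = ["sum += a[i];", "int totl = sum / 0;"]
for score, line in rank_changed_lines(change, history):
    print(f"{score:6.2f}  {line}")
```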