Recent publications
Grasping is at the core of many robotic manipulation tasks. Despite recent progress, closed-loop grasp planning in stacked scenes remains unsatisfactory in terms of efficiency, stability, and, most importantly, safety. In this paper, we present CSGP, a closed-loop safe grasp planning approach via attention-based deep reinforcement learning (DRL) from demonstrations, which learns safe grasping policies that leave surrounding objects less disturbed or damaged during manipulation. In CSGP, a 6-DoF safe grasping policy network with a Next-Best-Region attention module is proposed to intrinsically identify the safe regions in the view, facilitating the learning of safe grasping actions. We also design a fully automatic pipeline in the simulator to collect safe grasping demonstrations, which are used to pre-train the policy with behavior cloning and fine-tune it with DRL. To improve the policy effectively and stably during fine-tuning, a DRL-from-demonstrations algorithm named Safe-DDPGfD is presented in CSGP with a truncated height-anneal exploration mechanism for safe exploration. Moreover, we provide a benchmark containing scenes with multiple levels of stack layers for method evaluation. Simulation results demonstrate the state-of-the-art performance of our method, which achieves an Overall score of $88\%$ in our benchmark. Real-world robot grasping experiments also show the effectiveness of our method.
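As a rough illustration of the two-stage training scheme described in this abstract (behavior-cloning pre-training followed by DRL fine-tuning), here is a minimal sketch; the network shapes, feature dimensions, and losses are illustrative assumptions, not CSGP's actual architecture or the Safe-DDPGfD algorithm itself.

```python
import torch
import torch.nn as nn

# Toy actor-critic; 256-d observation features and a 6-DoF grasp action
# are placeholders, not the paper's real encoder outputs.
actor = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 6))
critic = nn.Sequential(nn.Linear(256 + 6, 128), nn.ReLU(), nn.Linear(128, 1))
opt_actor = torch.optim.Adam(actor.parameters(), lr=1e-4)

def behavior_cloning_step(obs, demo_action):
    """Stage 1: pre-train the policy to imitate safe grasping demonstrations."""
    loss = ((actor(obs) - demo_action) ** 2).mean()
    opt_actor.zero_grad(); loss.backward(); opt_actor.step()
    return loss.item()

def ddpg_actor_step(obs):
    """Stage 2: fine-tune DDPG-style, pushing actions toward high critic value."""
    loss = -critic(torch.cat([obs, actor(obs)], dim=-1)).mean()
    opt_actor.zero_grad(); loss.backward(); opt_actor.step()
    return loss.item()
```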
Background and importance:
Although hypoperfusion of the basal ganglia or the frontal subcortical matter is suspected, the pathology of chorea in moyamoya disease remains unclarified. Herein, we report a case of moyamoya disease presenting with hemichorea and evaluate pre- and postoperative perfusion using single photon emission computed tomography with N-isopropyl-p-123I-iodoamphetamine (123I-IMP SPECT).
Clinical presentation:
An 18-year-old woman presented with choreic movement of her left limbs. Magnetic resonance imaging revealed an ivy sign, and 123I-IMP SPECT demonstrated decreased cerebral blood flow (CBF) and cerebral vascular reserve (CVR) values in the right hemisphere. The patient underwent direct and indirect revascularization surgery to improve cerebral hemodynamic impairment. The choreic movements resolved entirely immediately after surgery. Although quantitative SPECT showed increased CBF and CVR values in the ipsilateral hemisphere, they did not reach normal values.
Conclusion:
Choreic movement in moyamoya disease may be related to cerebral hemodynamic impairment. Further studies are required to elucidate its pathophysiological mechanisms.
NVMe SSDs hugely boost I/O speed, with up to GB/s throughput and microsecond-level latency. Unfortunately, DBMS users often find that their high-performance storage devices deliver less-than-expected, or even worse, performance compared to their traditional peers. While many works focus on new DBMS designs that fully exploit NVMe SSDs, few systematically study the symptoms, root causes, and possible detection methods of such performance mismatches on existing databases.
In this paper, we start with an empirical study in which we systematically expose and analyze performance mismatches on six popular databases via controlled configuration tuning. From the study, we find that all six databases can suffer from performance mismatches. Moreover, we conclude that the root causes can be categorized as the databases' unawareness of new storage device characteristics in I/O size, I/O parallelism, and I/O sequentiality. We report 17 mismatches to developers, 15 of which have been confirmed.
Additionally, since testing all configuration knobs is inefficient, we propose a fast performance-mismatch detection framework; evaluation shows that it achieves a two-orders-of-magnitude speedup over the baseline without sacrificing effectiveness.
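The controlled configuration tuning used in the study can be pictured with a small sketch: vary one I/O-related knob at a time, benchmark each candidate value, and flag settings that leave most of the device's attainable throughput unused. The `run_benchmark` hook and the knob name below are hypothetical placeholders, not a real DBMS API or the paper's framework.

```python
# Flag knob values whose throughput falls far below the best observed value.
def detect_mismatch(knob, candidate_values, run_benchmark, tolerance=0.5):
    results = {v: run_benchmark(**{knob: v}) for v in candidate_values}
    best = max(results.values())
    return {v: tput for v, tput in results.items() if tput < tolerance * best}

# Usage with a stubbed benchmark standing in for a real workload run.
suspicious = detect_mismatch("io_size_kb", [4, 16, 64, 256],
                             run_benchmark=lambda io_size_kb: io_size_kb ** 0.5)
```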
Plain Language Summary
Accurate descriptions of raindrop shapes rely on complex models, since the shapes are affected by multiple factors such as gravity, air resistance, and surface tension. Despite this intricacy, many observational studies have suggested that raindrops can be approximated simply by oblate spheroids characterized by an axis ratio (the ratio of the minor to the major axis, AR). This underpins polarimetric radars, which quantitatively measure this asymmetry for improved rainfall remote sensing. However, our current understanding of ARs is based on 2-Dimensional (2D) projections of raindrops, and it is not clear how well this 2D method represents the realistic ARs of the 3-Dimensional (3D) shapes of raindrops. In this study, we use the orthogonal observations of a two-dimensional video disdrometer (2DVD) to reconstruct the 3D shapes of raindrops and quantify the biases of using a single camera and the 2DVD raw products. The results show that the traditional 2D methods overestimate the ARs of large raindrops, which can bias differential reflectivity, and the quantitative precipitation estimation algorithms derived from it are likewise subject to serious errors. We demonstrate the necessity of considering “real” ARs in polarimetric radar simulations and believe that the presented 3D-reconstruction AR parameterization should be used in future studies.
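To make the 2D-versus-3D distinction concrete, here is a toy sketch of the reconstruction idea: with two orthogonal silhouettes, the horizontal cross-section can be approximated by an ellipse whose axes are the two camera widths, instead of trusting a single projected width. This is an illustrative simplification that ignores canting angles and camera calibration, not the paper's full reconstruction method.

```python
def axis_ratio_3d(width_cam_a, width_cam_b, height):
    """Approximate the drop as a spheroid whose equivalent horizontal axis is
    the geometric mean of the two orthogonal camera widths; AR is the vertical
    axis over that horizontal axis."""
    horizontal_axis = (width_cam_a * width_cam_b) ** 0.5
    return height / horizontal_axis

def axis_ratio_2d(width_single_cam, height):
    """Single-camera estimate: the projected width can undershoot the true
    major axis, so this ratio tends to overestimate AR for flattened drops."""
    return height / width_single_cam
```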
As technology nodes continue to shrink, radiation-induced soft errors have become a great threat to circuit reliability. Among all causes, the Single-Event Transient (SET) is the dominant one for radiation-induced soft errors. SET-induced soft errors can be mitigated in multiple ways; in terms of area and power overhead, blocking SET propagation is considered the most efficient approach to soft error reduction. It has been found that the SET pulse width can be shrunk by the pulse quenching effect, which can be exploited to mitigate soft errors without introducing any area or power overhead. In this paper, we present an effective detailed placer that exploits the pulse quenching effect for soft error reduction in combinational circuits. In our method, quenching-effect enhancement is optimized globally while cell displacement is minimized. Experimental results demonstrate that our method reduces the soft error vulnerability of the circuits by \(29.53\%\), versus \(18.38\%\) for the state-of-the-art solution. Meanwhile, our method has minimal effect on displacement and half-perimeter wire length (HPWL) compared with previous solutions, implying minimal timing impact on the original design.
Favipiravir and remdesivir are drugs used to treat COVID-19. This study aims to find an optimal, validated method for the simultaneous analysis of favipiravir and remdesivir in Volumetric Absorptive Microsampling (VAMS) by ultra-high-performance liquid chromatography–tandem mass spectrometry. VAMS is advantageous because the required blood volume is small and sample preparation is simple. Sample preparation was done by protein precipitation using 500 μL of methanol. Analysis was carried out by ultra-high-performance liquid chromatography–tandem mass spectrometry with positive electrospray ionization (ESI+) and multiple reaction monitoring (MRM) at m/z 157.9 > 112.92 for favipiravir, 603.09 > 200.005 for remdesivir, and 225.968 > 151.991 for acyclovir as the internal standard. Separation was carried out on an Acquity UPLC BEH C18 column (100 × 2.1 mm; 1.7 μm) with 0.2% formic acid–acetonitrile (50:50) as the mobile phase, a flow rate of 0.15 mL/min, and a column temperature of 50°C. The analytical method was validated against the requirements issued by the Food and Drug Administration (2018) and the European Medicines Agency (2011). The calibration range is 0.5–160 μg/mL for favipiravir and 0.002–8 μg/mL for remdesivir.
Using Climate Forecast System Reanalysis (CFSR) data and numerical simulations, we investigate the impacts of multi-scale sea surface temperature (SST) anomalies in the North Pacific on boreal winter atmospheric circulations. Three types of forcing are considered: the basin-scale SST anomaly of the Pacific Decadal Oscillation (PDO) pattern, a narrow meridional band of frontal-scale smoothed SST anomaly in the subtropical front zone (STFZ), and spatially dispersed eddy-scale SST anomalies within the STFZ. Statistical analyses indicate that all three oceanic forcings correspond to similar patterns of change in the winter North Pacific jet. Furthermore, several atmospheric general circulation model simulations are used to reveal the differences among the three forcings and the detailed processes involved. The basin-scale cold PDO-pattern SST anomaly first causes negative turbulent heat flux anomalies, atmospheric cooling, and wind deceleration in the lower atmosphere. Subsequently, the lower-level cooling, with an amplified meridional temperature gradient and baroclinicity to the south, brings lagged mid-level warming through enhanced atmospheric eddy heat transport. The poleward and upward development of baroclinic fluctuations eventually accelerates the upper-level jet. The smoothed frontal-scale and eddy-scale SST anomalies in the STFZ cause jet anomalies comparable to those of the basin-scale forcing by changing the upward baroclinic energy and Eliassen-Palm fluxes. The forcing effects of multi-scale SST anomalies coexist in the mid-latitude North Pacific and can cause similar anomalous upper atmospheric circulations, which is probably why it is difficult to attribute a specific observed atmospheric circulation variation to a particular oceanic forcing.
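For context, the Eliassen-Palm (EP) flux invoked above is commonly written in its quasi-geostrophic, pressure-coordinate form (a textbook convention, not necessarily the exact diagnostic computed in this study):

\[ \mathbf{F} = \left( -\overline{u'v'},\; f\,\frac{\overline{v'\theta'}}{\partial\overline{\theta}/\partial p} \right), \]

where overbars denote zonal means and primes denote eddies; its divergence acts as a zonal force on the mean flow, which is how anomalous upward baroclinic energy and EP fluxes translate into the upper-level jet changes described here.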
Drones have drawn considerable attention as agents of wireless data collection for agricultural applications, by virtue of their three-dimensional mobility and dominant line-of-sight communication channels. Existing works mainly exploit dedicated drones via deployment and maintenance, which is insufficient in terms of resources and cost-efficiency. In contrast, leveraging existing delivery drones to collect data along their delivery routes, called delivery drones' piggybacking, is a promising solution. To achieve such cost-efficiency, drone scheduling inevitably comes first, but the delivery missions involved escalate it into a wholly different and unexplored problem. As a first attempt, we survey 514 delivery workers and conduct field experiments; notably, the collection cost, which mostly comes from the energy consumption of drones' piggybacking, is determined by the decisions on package-route scheduling and data collection time distribution. Based on these findings, we build a new model that jointly optimizes these two decisions to maximize the amount of data collected, subject to the collection budget and delivery constraints. Further analysis finds it to be a Mixed Integer Non-Linear Programming problem, which is NP-hard. The major challenge stems from the interdependence entangling the two decisions. To address this, we propose Delta, a \(\frac{1}{9+\delta}\)-approximation delivery drone scheduling algorithm. The key idea is an approximate collection time distribution scheme leveraging energy slicing, which transforms the complex problem with two interdependent variables into a submodular function maximization problem with only one variable. Theoretical proofs and extensive evaluations verify the effectiveness and near-optimal performance of Delta.
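As a hedged illustration of the problem form that the abstract says Delta reduces to, the snippet below runs a generic cost-benefit greedy for budgeted monotone submodular maximization; Delta's energy-slicing scheme and its \(\frac{1}{9+\delta}\) analysis are substantially more involved and are not reproduced here.

```python
# Generic budgeted greedy: repeatedly pick the candidate with the best
# marginal value per unit cost that still fits in the budget.
def greedy_submodular(candidates, value, cost, budget):
    chosen, spent = [], 0.0
    remaining = set(candidates)
    while remaining:
        best = max(remaining,
                   key=lambda c: (value(chosen + [c]) - value(chosen)) / cost(c))
        if spent + cost(best) > budget:
            remaining.discard(best)  # too expensive; try the next candidate
            continue
        chosen.append(best)
        spent += cost(best)
        remaining.discard(best)
    return chosen
```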
To investigate the formation mechanism of the η phase (Fe2Al5) during the aluminizing process, a so-called marker experiment was developed in this study. Small, angular MgO particles were placed firmly on the surface of an iron plate. The iron specimen was dipped for 11 minutes in an iron-saturated aluminum melt bath kept at 700 °C, and the cross section was then analyzed metallographically. MgO particles were found on the boundary between the Al-Fe alloy region and the η phase. This indicates that Al diffuses into the α-iron while Fe does not diffuse out into the aluminum melt, and that the η phase forms at the η/α-iron interface. Since the thickness of the η phase is proportional to the square root of the dipping time, the process is controlled by the volume diffusion of Al atoms in the η phase. It was also found that the thickness of the iron plate increased with increasing dipping time, which can be explained semi-quantitatively by taking into account the change in phase and volume between α-iron and the η phase.
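The square-root dependence reported above is the classical parabolic growth law; as a worked note (generic algebra, with no numbers from this study), if the layer thickness follows

\[ d_\eta = k\sqrt{t} \quad\Longleftrightarrow\quad d_\eta^{\,2} = k^2 t, \]

then a plot of \(d_\eta^2\) against dipping time \(t\) is a straight line whenever growth is controlled by volume diffusion, and quadrupling the dipping time doubles the layer thickness.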
Increasing incidents of artwork plagiarism underscore the urgent need for reliable copyright protection of high-quality artwork images. Although watermarking helps with this issue, existing methods are limited in imperceptibility and robustness. To provide high-level protection for valuable artwork images, we propose a novel invisible robust watermarking framework, dubbed IRWArt. In our architecture, watermark embedding and recovery are treated as a pair of inverse image-transformation problems and are implemented through the forward and backward processes of an invertible neural network (INN), respectively. For high visual quality, we embed the watermark in high-frequency domains with minimal impact on the artwork and supervise image reconstruction using a human visual system (HVS)-consistent deep perceptual loss. For strong plagiarism resistance, we construct a quality enhancement module that hardens the embedded image against possible distortions caused by plagiarism actions. Moreover, a two-stage, contrastive training strategy enables the simultaneous realization of the above two goals. Experimental results on four datasets demonstrate the superiority of IRWArt over other state-of-the-art watermarking methods. Code: https://github.com/1024yy/IRWArt.
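The forward/backward symmetry described above can be illustrated with a toy additive coupling layer, a basic building block of many INNs: the same weights run forward to embed and backward to recover exactly. This is a generic INN illustration, not IRWArt's wavelet-domain architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 0.1  # fixed toy "subnetwork" weights

def forward(cover, watermark):
    return cover, watermark + np.tanh(cover @ W)   # embed

def inverse(cover, stego):
    return stego - np.tanh(cover @ W)              # recover exactly

cover, wm = rng.normal(size=4), rng.normal(size=4)
_, stego = forward(cover, wm)
assert np.allclose(inverse(cover, stego), wm)      # lossless inversion
```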
Quick and accurate detection of insider packet-drop attackers is critical for reducing the damage they can inflict on a network. Trust mechanisms have been widely used in wireless sensor networks for this purpose. However, existing trust models are not effective because they cannot distinguish between packet drops caused by an attack and those caused by normal network failures. We observe that insider packet-drop attacks cause more consecutive packet drops than network abnormalities do. Therefore, we propose using consecutive packet drops to speed up the detection of insider packet-drop attackers. In this article, we describe a new trust model based on consecutive drops and develop a hybrid trust mechanism that seamlessly integrates the new trust model with existing ones. We perform extensive OPNET (Optimized Network Engineering Tool) simulations using a geographic greedy routing protocol to validate the effectiveness of the new model. The simulation results show that our hybrid trust model outperforms existing trust models for all types of insider packet-drop attacks, not only in the detection speed and accuracy it is designed for, but also in other important network performance metrics such as packet delivery rate, routing reliability, and energy efficiency.
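A minimal sketch of the consecutive-drop idea: penalize a node's trust in proportion to its current run of drops, so long attack-like runs erode trust faster than scattered failure-like drops. The update rule and parameters are illustrative assumptions, not the paper's calibrated model.

```python
# Consecutive-drop-aware trust update; returns the clamped score and run length.
def update_trust(trust, dropped, run_length, alpha=0.05, beta=0.02):
    if dropped:
        run_length += 1
        trust -= alpha * run_length    # escalating penalty for consecutive drops
    else:
        run_length = 0
        trust += beta * (1.0 - trust)  # slow recovery on successful forwarding
    return max(0.0, min(1.0, trust)), run_length
```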
A globally aging population results in long-term care of people with chronic illnesses, affecting the quality of life of the elderly. Integrating smart technology with long-term care services can enhance and maximize healthcare quality, while planning a smart long-term care information strategy can satisfy the varied care demands of hospitals, home-care institutions, and communities. Evaluating a smart long-term care information strategy is necessary for developing smart long-term care technology. This study applies a hybrid Multi-Criteria Decision-Making (MCDM) method that uses the Decision-Making Trial and Evaluation Laboratory (DEMATEL) integrated with the Analytic Network Process (ANP) to rank and prioritize smart long-term care information strategies. In addition, this study incorporates various resource constraints (budget, network platform cost, training time, labor cost-saving ratio, and information transmission efficiency) into a Zero-One Goal Programming (ZOGP) model to capture the optimal smart long-term care information strategy portfolios. The results indicate that the hybrid MCDM decision model can provide decision-makers with the optimal service-platform selection for a smart long-term care information strategy, maximizing information service benefits while allocating constrained resources most efficiently.
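To sketch the portfolio-selection step, the snippet below solves a simplified binary selection under a budget cap with PuLP; the strategy names, weights, and costs are made-up placeholders, and a full ZOGP formulation would additionally carry goal deviation variables for each resource constraint.

```python
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpBinary, PULP_CBC_CMD

strategies = {"hospital_platform": (0.42, 60),   # (priority weight, cost)
              "home_care_app":     (0.33, 40),
              "community_portal":  (0.25, 30)}
budget = 80

x = {s: LpVariable(s, cat=LpBinary) for s in strategies}
prob = LpProblem("ltc_portfolio", LpMaximize)
prob += lpSum(strategies[s][0] * x[s] for s in strategies)            # total benefit
prob += lpSum(strategies[s][1] * x[s] for s in strategies) <= budget  # budget cap
prob.solve(PULP_CBC_CMD(msg=False))
chosen = [s for s in strategies if x[s].value() == 1]
```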
Decision-making quality is a concern for the public sector because its decisions may involve more dimensions than those faced by the private sector. However, past studies of public sector efficiency have ignored the uncertainty of data estimation. This study addresses this shortcoming by proposing a chance-constrained network DEA approach, based on an enhanced Russell-based directional distance measure, for evaluating public sector performance. The proposed approach uses stochastic inputs and outputs to enhance decision-making quality and capacity in the public sector. Its usefulness is demonstrated through an empirical case of OECD countries. The approach also yields practical suggestions for promoting a green economic transformation and serves as a reference for government policies.
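For context on the chance-constrained ingredient, a typical formulation requires each stochastic envelopment constraint to hold with a prescribed probability (a generic textbook form, not necessarily the exact model of this study):

\[ \Pr\!\left\{ \sum_{j=1}^{n} \lambda_j \tilde{x}_{ij} \le \tilde{x}_{io} \right\} \ge 1-\alpha, \]

where \(\tilde{x}_{ij}\) are stochastic inputs, \(\lambda_j\) are intensity weights, and \(\alpha\) is the accepted risk level; under normality, such constraints admit deterministic equivalents via the standard normal inverse \(\Phi^{-1}\).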