Deloitte & Touche LLP
  • New York City, NY, United States
Recent publications
This paper analyzes the effectiveness of government subsidies in promoting product diversity when a downstream firm has buyer power. Using an extension of the Dixit‐Stiglitz model of monopolistic competition, we compare the effects of subsidies on the equilibrium number of differentiated products and social welfare in the case where products are sold directly to consumers versus the case where they are distributed through a monopoly retailer with buyer power. We find that a production subsidy promotes product diversity in both cases, but the mechanisms through which a subsidy raises the number of products are different. Compared with the case where products are distributed directly to consumers, retailer buyer power reduces product diversity and social welfare. Furthermore, it weakens the effectiveness of the subsidy in promoting product diversity. At any given subsidy rate the equilibrium number of products is smaller, and a rise in the subsidy rate leads to a smaller increase in the number of products.
The Italian rice agroecosystem plays a key role in European production and provides a unique range of rice varieties. As productive man-made wetlands, rice paddies are strategic and economic components of habitat provision for migratory wildlife at the European scale. However, their nature as “temporary wetlands” can create an ecological trap for a number of living organisms. For this reason, the agricultural practices adopted for the management of rice paddies are essential for moving towards more sustainable cultivation capable of promoting biodiversity and minimising negative environmental impacts. This study proposes an ecologically-oriented strategy to implement a circular and self-regulating farming system, designed with consideration of the role of constructed wetlands in providing ecosystem services in rice agroecosystems. It demonstrates the economic feasibility and benefits of a self-regulating biosystem based on an integrated wetland for a small rice farm in the Vercelli province (Piedmont Region, Italy). The study was conducted in collaboration with the rice farm, which already experiments with organic farming techniques. The investigation focuses on the farm's current management structure and develops an ecologically-oriented business strategy to sustain local biodiversity. This strategy rediscovers and improves the traditional co-culture technique through the development of a permanent pond. It explores the potential benefits generated by the approach in terms of biodiversity conservation, biological control of pests and weeds, and habitat provision for wildlife. The study demonstrates the economic sustainability of the business strategy in a real case study through financial analysis. The findings highlight promising economic outcomes compared to conventional rice cultivation systems. The diversification of the marketing strategy and the reduction of operating costs are key factors in the success of the strategy.
The ecologically-oriented design methodology presented in this article can easily be applied to other small-scale farms in the agrifood sector.
This study explores the relationship between CFO qualifications and a firm’s internal control weaknesses (ICWs). We use three measures of CFO competence: financial/accounting background, seniority, and education. Using a sample of Taiwanese listed firms from 2012 to 2015, the results of this study show a negative relationship between CFO financial/accounting background and seniority and internal control weaknesses, indicating that firms with higher quality CFOs experience fewer ICWs. CFO education level, however, is not related to a firm’s internal control weaknesses. The results imply that capable CFOs can effectively implement a good internal control system. This study provides practical and policy implications.
Objective: Assess the effectiveness of providing Logical Observation Identifiers Names and Codes (LOINC®)-to-In Vitro Diagnostic (LIVD) coding specification, required by the United States Department of Health and Human Services for SARS-CoV-2 reporting, in medical center laboratories and utilize findings to inform future United States Food and Drug Administration policy on the use of real-world evidence in regulatory decisions. Materials and methods: We compared gaps and similarities between diagnostic test manufacturers' recommended LOINC® codes and the LOINC® codes used in medical center laboratories for the same tests. Results: Five medical centers and three test manufacturers extracted data from laboratory information systems (LIS) for prioritized tests of interest. The data submission ranged from 74 to 532 LOINC® codes per site. Three test manufacturers submitted 15 LIVD catalogs representing 26 distinct devices, 6956 tests, and 686 LOINC® codes. We identified mismatches in how medical centers use LOINC® to encode laboratory tests compared to how test manufacturers encode the same laboratory tests. Of 331 tests available in the LIVD files, 136 (41%) were represented by a mismatched LOINC® code by the medical centers (chi-square 45.0, 4 df, P < .0001). Discussion: The five medical centers and three test manufacturers vary in how they organize, categorize, and store LIS catalog information. This variation impacts data quality and interoperability. Conclusion: The results of the study indicate that providing the LIVD mappings was not sufficient to support laboratory data interoperability. National implementation of LIVD and further efforts to promote laboratory interoperability will require a more comprehensive effort and continuing evaluation and quality control.
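The gap analysis described above reduces to comparing two code mappings for the same tests: the manufacturer's LIVD-recommended LOINC codes against the codes the laboratory's LIS actually assigned. A minimal sketch, with made-up test names and codes purely for illustration (the real study compared LIVD catalog entries against LIS extracts):

```python
# Hypothetical sketch: count mismatches between a manufacturer's LIVD
# LOINC mapping and the codes a lab's LIS actually used for the same tests.
# All test names and codes below are illustrative, not from the study.

livd_mapping = {            # manufacturer-recommended LOINC per test
    "SARS-CoV-2 RNA NAA": "94500-6",
    "SARS-CoV-2 Ab IgG":  "94563-4",
    "SARS-CoV-2 Ag":      "94558-4",
}

lis_mapping = {             # codes the lab actually assigned
    "SARS-CoV-2 RNA NAA": "94500-6",   # match
    "SARS-CoV-2 Ab IgG":  "94762-2",   # mismatch
    "SARS-CoV-2 Ag":      "94558-4",   # match
}

shared = set(livd_mapping) & set(lis_mapping)
mismatches = sorted(t for t in shared if livd_mapping[t] != lis_mapping[t])
rate = len(mismatches) / len(shared)
print(f"{len(mismatches)}/{len(shared)} tests mismatched ({rate:.0%})")
```

Scaled up to the study's 331 shared tests, the same comparison yields the reported 136 (41%) mismatched codes.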
The COVID-19 pandemic has posed an unprecedented challenge to public health across the world and has further exposed health disparities and the vulnerability of marginal groups. Since the pandemic has exhibited marked regional differences, it is necessary to better understand the levels of vulnerability to the disease at local levels and provide policymakers with additional tools that will allow them to develop finely targeted policies. In this study, we develop for the State of Alabama (USA) a composite vulnerability index at county level that can be used as a tool that will help in the management of the pandemic. Twenty-four indicators were assigned to the following three categories: exposure, sensitivity, and adaptive capacity. The resulting subindices were aggregated into a composite index that depicts the vulnerability to COVID-19. A multivariate analysis was used to assign factor loadings and weights to indicators, and the results were mapped using Geographic Information Systems. The vulnerability index captured health disparities very well. Many of the most vulnerable counties were found in the Alabama Black Belt region. A deconstruction of the overall index and subindices allowed the development of individual county profiles and the detection of local strengths and weaknesses. We expect the model developed in this study to be an efficient planning tool for decision-makers.
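The index construction described above — normalize indicators, combine them into exposure, sensitivity, and adaptive-capacity subindices, then aggregate into a composite score — can be sketched as follows. The county values and equal weights here are hypothetical; the study derived its weights from a multivariate (factor) analysis:

```python
# Illustrative sketch of a composite vulnerability index: min-max normalize
# each indicator, average indicators within a subindex (equal weights for
# this sketch), invert adaptive capacity, and average the three subindices.

def min_max(values):
    """Rescale a list of raw values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

counties = ["A", "B", "C"]
# hypothetical raw indicator values per county, grouped by subindex
indicators = {
    "exposure":          [[0.2, 0.8, 0.5]],             # e.g. case rate
    "sensitivity":       [[65, 80, 72], [10, 25, 18]],  # e.g. % elderly, % uninsured
    "adaptive_capacity": [[5, 2, 3]],                   # e.g. hospitals per 100k
}

subindices = {}
for name, series in indicators.items():
    normed = [min_max(s) for s in series]
    subindices[name] = [sum(col) / len(col) for col in zip(*normed)]

# higher adaptive capacity lowers vulnerability, so invert that subindex
subindices["adaptive_capacity"] = [1 - v for v in subindices["adaptive_capacity"]]

composite = [sum(vals) / 3 for vals in zip(*subindices.values())]
for county, score in zip(counties, composite):
    print(county, round(score, 3))
```

In this toy example county B scores highest on all three subindices and therefore emerges as the most vulnerable; deconstructing the composite back into its subindices gives the per-county strength/weakness profiles the study describes.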
Automatic building segmentation from satellite images is an important task for a variety of applications such as urban mapping, disaster management, and regional planning. With the wider availability of very high resolution satellite images, deep learning based techniques have been broadly used for remote sensing image related tasks. In this study, we generated a new building dataset, the Istanbul dataset, for the building segmentation task. 150 Pléiades image tiles of 1500 x 1500 pixels, covering an 85 km² area of the city of Istanbul, were used, and approximately 40,000 buildings were labelled, representing different building structures and spatial distributions. We extensively investigated the ideal architecture, encoder, and hyperparameter settings for building segmentation tasks using the new Istanbul dataset. More than 60 experiments were conducted by applying state-of-the-art architectures such as U-Net, Unet++, DeepLabv3+, FPN and PSPNet with different pre-trained encoders and hyperparameters. Our experiments showed that the Unet++ architecture using an SE-ResNeXt101 encoder pre-trained on ImageNet provides the best results, with 93.8% IoU on the Istanbul dataset. To demonstrate our solution's generality, the ideal network was also trained separately on the Inria and Massachusetts building segmentation datasets. The networks produced IoU values of 75.39% and 92.53% on the Inria and Massachusetts datasets, respectively. The results indicate that our ideal network settings outperform other methods in terms of building segmentation even without any specific architectural modification. The weight files and inference notebook are available at:
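The IoU (intersection-over-union) figures quoted above are the standard overlap metric between a predicted building mask and the ground-truth mask. A minimal sketch on flattened binary masks, with tiny made-up masks for illustration:

```python
# Intersection-over-union for binary segmentation masks (1 = building pixel).
# The two eight-pixel masks below are made up for illustration only.

def iou(pred, truth):
    """IoU between two equal-length binary masks."""
    inter = sum(p & t for p, t in zip(pred, truth))   # pixels both call building
    union = sum(p | t for p, t in zip(pred, truth))   # pixels either calls building
    return inter / union if union else 1.0            # two empty masks agree fully

pred  = [1, 1, 0, 1, 0, 0, 1, 0]
truth = [1, 0, 0, 1, 1, 0, 1, 0]
print(f"IoU = {iou(pred, truth):.2f}")  # 3 shared pixels / 5 union pixels = 0.60
```

A reported 93.8% IoU thus means the predicted and labelled building footprints overlap on 93.8% of the pixels in their union.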
Species that went extinct prior to the genomic era are often considered out-of-reach for modern phylogenetic studies. This is particularly limiting for conservation studies, as genetic data from such taxa may be key to understanding extinction risks and causes of decline that can inform the management of related, extant populations. Fortunately, continual advances in biochemistry and DNA sequencing offer increasing ability to recover DNA from historical fluid-preserved museum specimens. Here, we report on success in recovering nuclear and mitochondrial data from the putative subspecies Desmognathus fuscus carri Neill 1951, a plethodontid salamander from spring runs in central Florida that is apparently extinct. The two ethanol-preserved topotypic specimens we studied are 50 years old and were likely fixed in unbuffered formalin, but application of a recently derived extraction procedure yielded usable DNA and partially successful Anchored Hybrid Enrichment sequencing. These data suggest that D. f. carri from peninsular Florida is conspecific with the D. auriculatus A lineage as suggested by previous authors, but may have represented an ecogeographically distinct population segment that has now been lost. Genetic data from this enigmatic disappearance thus confirm the geographic extent of population declines and extirpations as well as their ecological context, suggesting a possibly disproportionate loss from sandy-bottom clearwater streams compared to blackwater swamps. Success of these laboratory methods bodes well for large-scale application to fluid-preserved natural history specimens from relevant historical populations, but the possibility of significant DNA damage and related sequencing errors remains a hurdle to overcome.
We study the interaction of radio waves, microwaves, and infrared laser light of power P and period τ with a macroscopic thermoelectric (TEC) device-based detector and probe the energy Pτ as being the energy of these electromagnetic (EM) waves. Our detectors are in-series assemblies of TEC devices. We treat these detectors as equivalent to capacitors and/or inductors. The energy Pτ enables characterizing the detector’s parameters, such as equivalent capacitance, inductance, resistance, responsivities, effective power, and efficiency. Through various scaling procedures, Pτ also aids in determining the power P of the EM waves. We compare the performance of our detectors with that of other TEC devices and with radio- and microwave-sensitive devices reported in the current literature, such as spin–orbit torque and spin–torque oscillator devices, heterojunction backward tunnel diodes, and Schottky diodes. We observe that the performance of our detectors is inferior. However, the order of magnitude of our detector’s parameters is in reasonable agreement with those of other TEC and non-TEC devices. We conclude that TEC devices can be used to detect radio waves and that Pτ effectively captures the energy of the EM waves. Considering Pτ as the EM wave’s energy offers a classical approach to the interaction of EM waves with matter in which photons are not involved. With the EM wave’s energy depending upon two variables (P and τ), a similar response could be produced by, e.g., radio waves and visible light, leading to interesting consequences that we briefly outline.
Based on an integrated theoretical framework, we argue that socially responsible firms aspire to higher ethical and moral standards than other firms and foster higher intrinsic motivation to avoid downsizing. In line with this, we develop hypotheses proposing a negative association of Corporate Social Responsibility (CSR) with downsizing incidence and downsizing severity. Using panel data on U.S. firms over an eight‐year period, we confirm these hypotheses and find that CSR has a negative association with downsizing, an association that strengthens with the severity of downsizing. We discuss the implications of the findings and how our work contributes to the body of academic work on downsizing, with CSR as an important novel firm‐level determinant that links to corporate sustainability and stakeholder engagement.
Abstract Dusky Salamanders (genus Desmognathus) currently comprise only 22 described, extant species. However, recent mitochondrial and nuclear estimates indicate the presence of up to 49 candidate species based on ecogeographic sampling. Previous studies also suggest a complex history of hybridization between these lineages. Studies in other groups suggest that disregarding admixture may affect both phylogenetic inference and clustering‐based species delimitation. With a dataset comprising 233 Anchored Hybrid Enrichment (AHE) loci sequenced for 896 Desmognathus specimens from all 49 candidate species, we test three hypotheses regarding (i) species‐level diversity, (ii) hybridization and admixture, and (iii) misleading phylogenetic inference. Using phylogenetic and population‐clustering analyses considering gene flow, we find support for at least 47 candidate species in the phylogenomic dataset, some of which are newly characterized here while others represent combinations of previously named lineages that are collapsed in the current dataset. Within these, we observe significant phylogeographic structure, with up to 64 total geographic genetic lineages, many of which hybridize either narrowly at contact zones or extensively across ecological gradients. We find strong support for both recent admixture between terminal lineages and ancient hybridization across internal branches. This signal appears to distort concatenated phylogenetic inference, wherein more heavily admixed terminal specimens occupy apparently artifactual early‐diverging topological positions, occasionally to the extent of forming false clades of intermediate hybrids. Additional geographic and genetic sampling and more robust computational approaches will be needed to clarify taxonomy, and to reconstruct a network topology to display evolutionary relationships in a manner that is consistent with their complex history of reticulation.
Under pressure to improve strategy implementation effectiveness and organizational performance, many government officials look for new tools and methods to achieve greater public organizational performance. By improving public organizations' performance, public value and citizens' trust in government will improve. Saudi Arabia is a country in Western Asia and one of the Middle Eastern countries that depends on its natural resources (i.e. oil and gas production) as its primary source of income (such countries are termed rentier states). However, since the 1970s Saudi Arabia has been trying to reduce its dependence on oil and to diversify the products that feed its income. This notion has been injected by Saudi government officials into their national strategies (i.e. 5-year national development plans). However, several contemporary analyses show that Saudi Arabia still depends on oil as its primary source of income, which means that its national strategy implementation is not achieving its foremost objectives. Not achieving the strategic objectives keeps organisational performance low, which cumulatively has an impact on the whole government and the country's economy. Based on a sample of Saudi public organisations, the research describes the strategy implementation dynamics and issues, along with the drivers affecting implementation effectiveness that lead to organisational performance improvements, which ultimately lead to improvements in the economic performance of the country. One of the key findings of the research indicates that having a coordinated body for strategy implementation activities across the organisation is the top-ranked driver of successful strategy implementation.
Background: Artificial intelligence (AI), including machine learning (ML) and deep learning, has the potential to revolutionize biomedical research. Defined as the ability to "mimic" human intelligence by machines executing trained algorithms, AI methods are deployed for biomarker discovery. Objective: We detail the advancements and challenges in the use of AI for biomarker discovery in ovarian and pancreatic cancer. We also provide an overview of associated regulatory and ethical considerations. Methods: We conducted a literature review using PubMed and Google Scholar to survey the published findings on the use of AI in ovarian cancer, pancreatic cancer, and cancer biomarkers. Results: Most AI models associated with ovarian and pancreatic cancer have yet to be applied in clinical settings, and imaging data in many studies are not publicly available. Low disease prevalence and asymptomatic disease limit the data availability required for AI models. The FDA has yet to qualify imaging biomarkers as effective diagnostic tools for these cancers. Conclusions: Challenges associated with data availability, quality, and bias, as well as AI transparency and explainability, will likely persist. Explainable and trustworthy AI efforts will need to continue so that the research community can better understand and construct effective models for biomarker discovery in rare cancers.
The article introduces the basic concepts of the derivatives market, explains what these financial instruments are, how they work, and how they can be used to grow capital. Drawing on examples from historical development and simple illustrative cases, the mechanism by which derivatives function is explained and their links with underlying assets are studied; for instance, it is shown how the value of an exchange-traded instrument can turn negative. Exchange-traded and OTC markets are analyzed, their size is assessed on the basis of historical statistics from international financial institutions, and the mechanisms of their interaction with regulators in the Russian Federation and abroad are examined. Strategies for using these instruments for speculation, hedging, and arbitrage are demonstrated, and their advantages and disadvantages are identified. As a result, the authors put forward a strategy for entering an overvalued market with the help of options.
Introduction: Loss aversion when using gamification is incompletely understood. The aim of this study was therefore to examine how participants alter their behavior vis-à-vis meeting a daily step goal based on the prospect of losing or gaining a gamification level. Methods: We enrolled 602 participants across four arms who were given pedometers. In the three experimental arms, participants began at the medium level and were allocated 70 points each week, losing 10 points each day they did not meet their step goal. Having at least 40 points at the end of the week resulted in a level increase; otherwise they lost a level. We fit a generalized estimating equation, clustered on participants, modeling step goal attainment on day 7. Our primary predictor was a categorical variable simultaneously indicating what level the participants began the week in and whether they had more than, less than, or exactly 40 points after 6 days. Results: Participants at risk of losing the highest level were 18.40% (95% confidence interval [CI]: 18.26-19.90) more likely to meet their step goal than those who had secured the highest level. Participants who could potentially move from the low to the medium level were 10.61% (95% CI: 9.98-11.24) more likely to meet their step goal than those in the Control group. Those in the Medium group were similarly more likely to achieve their step goal on day 7 (10.00%, 95% CI: 9.15-10.85) than those who had already secured an increase to the high level. Discussion: We find that participants in this trial generally exhibit loss aversion so long as the loss relates to something that was earned rather than endowed. This knowledge can be incorporated in future interventions using gamification by requiring participants to earn all levels as they progress. ClinicalTrials.gov identifier: NCT03311230.
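The weekly points-and-levels rule described in the Methods can be sketched directly: 70 points to start the week, minus 10 for each day the step goal is missed, with at least 40 points needed to gain (or keep from losing) a level. The three-level ladder and its names are our reading of the abstract, used here for illustration:

```python
# Sketch of the weekly gamification rule as described in the abstract:
# start the week with 70 points, lose 10 per missed day, and move up a
# level with >= 40 points at week's end, otherwise move down a level.
# The named three-level ladder is an assumption for illustration.

LEVELS = ["low", "medium", "high"]

def end_of_week_level(current_level, days_goal_met):
    """Return (points remaining, level for next week)."""
    points = 70 - 10 * (7 - days_goal_met)     # lose 10 points per missed day
    idx = LEVELS.index(current_level)
    if points >= 40:
        idx = min(idx + 1, len(LEVELS) - 1)    # level up, capped at the top
    else:
        idx = max(idx - 1, 0)                  # level down, floored at the bottom
    return points, LEVELS[idx]

print(end_of_week_level("medium", days_goal_met=5))  # 2 missed days -> 50 points
```

Note that under this rule a participant can miss up to three days and still advance, which is the "earned rather than endowed" threshold the Discussion turns on.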
Importance: Low surgical volume in the US Military Health System (MHS) has been identified as a challenge to military surgeon readiness. The Uniformed Services University of Health Sciences, in partnership with the American College of Surgeons, developed the Knowledge, Skills, and Abilities (KSA) Clinical Readiness Program that includes a tool for quantifying the clinical readiness value of surgeon workload, known as the KSA metric. Objective: To describe changes in US military general surgeon procedural volume and readiness using the KSA metric. Design, setting, and participants: This cohort study analyzed general surgery workload performed across the MHS, including military and civilian facilities, between fiscal years 2015 and 2019 and the calculated KSA metric value. The surgeon-level readiness among military general surgeons was calculated based on the KSA metric readiness threshold. Data were obtained from TRICARE, the US Department of Defense health insurance product. Main outcomes and measures: The main outcomes were general surgery procedural volumes and the KSA metric point value of those procedures across the MHS as well as the number of military general surgeons meeting the KSA metric readiness threshold. Aggregate facility and regional market-level claims data were used to calculate the procedural volumes and KSA metric readiness value of those procedures. Annual adjusted KSA metric points earned were used to determine the number of individual US military general surgeons meeting the readiness threshold. Results: The number of general surgery procedures generating KSAs in military hospitals decreased 25.6%, from 128,377 in 2015 to 95,461 in 2019, with a 19.1% decrease in the number of general surgeon KSA points (from 7,155,563 to 5,790,001). From 2015 to 2019, there was a 3.2% increase in both the number of procedures (from 419,980 to 433,495) and KSA points (from 21,071,033 to 21,748,984) in civilian care settings.
The proportion of military general surgeons meeting the KSA metric readiness threshold decreased from 16.7% (n = 97) in 2015 to 10.1% (n = 68) in 2019. Conclusions and relevance: This study noted that the number of KSA metric points and procedural volume in military hospitals has been decreasing since 2015, whereas both measures have increased in civilian facilities. The findings suggest that loss of surgical workload has resulted in further decreases in military surgeon readiness and may require substantial changes in patient care flow in the MHS to reverse the change.
Objective This study assessed the cost-effectiveness of the Centers for Disease Control and Prevention’s (CDC’s) Sodium Reduction in Communities Program (SRCP). Design We collected implementation costs and performance measure indicators from SRCP recipients and their partner food service organizations. We estimated the cost per person and per food service organization reached and the cost per menu item impacted. We estimated the short-term effectiveness of SRCP in reducing sodium consumption and used it as an input in the Prevention Impact Simulation Model to project the long-term impact on medical cost savings and quality-adjusted life years gained due to a reduction in cardiovascular disease, and to estimate the cost-effectiveness of SRCP if sustained through 2025 and 2040. Setting CDC funded eight recipients as part of the 2016–2021 round of SRCP to work with food service organizations in eight settings to increase the availability and purchase of lower-sodium food options. Participants Eight SRCP recipients and 20 of their partners. Results At the recipient level, average cost per person reached was $10, and average cost per food service organization reached was $42,917. At the food service organization level, median monthly cost per food item impacted by recipe modification or product substitution was $684. Cost-effectiveness analyses showed that, if sustained, the program is cost saving (i.e., the reduction in medical costs is greater than the implementation costs) in the target population by $1.82 through 2025 and $2.09 through 2040. Conclusions By providing evidence of the cost-effectiveness of a real-world sodium reduction initiative, this study can help inform decisions by public health organizations about related cardiovascular disease prevention interventions.
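The unit-cost measures reported above are simple ratios of implementation cost to reach, and "cost saving" means averted medical costs exceed implementation costs. A back-of-the-envelope sketch with hypothetical totals (the study's actual averages were $10 per person and $42,917 per organization):

```python
# Hypothetical sketch of the unit-cost measures used in the study above.
# All totals are made up for illustration; only the formulas mirror the text.

total_cost = 1_200_000.0          # hypothetical recipient implementation cost
people_reached = 120_000
organizations_reached = 28

cost_per_person = total_cost / people_reached
cost_per_org = total_cost / organizations_reached

# a program is "cost saving" when averted medical costs exceed its cost
medical_costs_averted = 1_500_000.0   # hypothetical model projection
net_savings = medical_costs_averted - total_cost

print(f"cost per person reached: ${cost_per_person:.2f}")
print(f"cost per organization reached: ${cost_per_org:,.2f}")
print(f"net savings: ${net_savings:,.2f}")
```

The study's $1.82 and $2.09 figures are the same net-savings idea expressed per person in the target population, projected through 2025 and 2040 respectively.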
359 members
Kathleen T. Ashenfelter
  • Deloitte Consulting, Strategy and Operations, Mission Analytics
Eli M Dow
  • Strategy and Analytics
Rhodri Dierst-Davies
  • Government and Public Service
Christodoulos Kokotsakis
  • Department of Finance