College of William and Mary
  • Williamsburg, United States
Recent publications
An edge (vertex) cut X of G is r-essential if G−X has two components each of which has at least r edges. A graph G is r-essentially k-edge-connected (resp. k-connected) if it has no r-essential edge (resp. vertex) cuts of size less than k. If r=1, we simply call it essential. Recently, Lai and Li proved that every m-edge-connected essentially h-edge-connected graph contains k edge-disjoint spanning trees, where k, m, h are positive integers such that k+1 ≤ m ≤ 2k−1 and h ≥ m²/(m−k) − 2. In this paper, we show that every m-edge-connected and 2-essentially h-edge-connected graph that is not a K5 or a fat triangle with multiplicity less than k has k edge-disjoint spanning trees, where k+1 ≤ m ≤ 2k−1 and h ≥ f(m,k), with f(m,k) = 2m+k−4 + k(2k−1)/(2m−2k−1) if m < k + (1+√(8k+1))/4, and f(m,k) = m+3k−4 + k²/(m−k) if m ≥ k + (1+√(8k+1))/4. Extending Zhan's result, we also prove that every 3-edge-connected essentially 5-edge-connected and 2-essentially 8-edge-connected graph has two edge-disjoint spanning trees. As an application, this gives a new sufficient condition for Hamilton-connectedness of line graphs. In 2012, Kaiser and Vrána proved that every 5-connected line graph of minimum degree at least 6 is Hamilton-connected. We allow graphs to have minimum degree 5 and prove that every 5-connected essentially 8-connected line graph is Hamilton-connected.
In order to combat information operations (IO) and disinformation campaigns, one must look at the behaviors of the accounts pushing specific narratives and stories through social media, not at the content itself. In this work, we present a new process for extracting tweet storms and uncovering networks of accounts that are working in a coordinated fashion using ridge count thresholding (RCT). To do this, we started with a dataset of 60 million individual tweets from the early weeks of the COVID-19 pandemic. Coherent topics are extracted from this data by testing three different preprocessing pipelines and applying Orthogonal Nonnegative Matrix Factorization (ONMF). The most effective preprocessing pipeline used hashtag preclustering to downselect the total dataset to the 7 million tweets that included the top hashtags. Each topic identified by ONMF is described by a topic-tweet signal, crafted using the time stamp included in each tweet's metadata. These signals were broken down into tweet storms using RCT, which is calculated from the Dynamic Wavelet Fingerprint transform of each topic-tweet signal. Each tweet storm describes a period of increased activity around a topic and represents some behavior in the underlying network. In total, we identified 39,817 tweet storms that included about 2 million unique tweets. These tweet storms were used to identify networks of accounts that commonly co-occur within tweet storms, isolating those communities most responsible for driving narratives and pushing stories through social media. Through this process, we were able to identify 22 unique networks of accounts that were densely connected based on RCT tweet storm identification. Many of the identified networks exhibit obvious inauthentic behaviors that are potentially part of an IO campaign.
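The abstract describes crafting a "topic-tweet signal" from the timestamp in each tweet's metadata. A minimal sketch of that step is shown below: binning a topic's tweet timestamps into a fixed-interval count series. The function name and the one-hour bin width are assumptions for illustration, not details from the paper.

```python
from datetime import datetime, timedelta

def topic_tweet_signal(timestamps, start, end, bin_minutes=60):
    """Bin one topic's tweet timestamps into a count-per-interval
    signal over [start, end]. Bin width is an assumed parameter."""
    step = timedelta(minutes=bin_minutes)
    n_bins = int((end - start) / step) + 1
    counts = [0] * n_bins
    for ts in timestamps:
        idx = int((ts - start) / step)  # which bin this tweet falls in
        if 0 <= idx < n_bins:
            counts[idx] += 1
    return counts
```

A signal like this could then be fed to a time-frequency transform (the paper uses the Dynamic Wavelet Fingerprint transform) to locate bursts of activity.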
How efficient is the targeting of foreign aid to populations in need? A long literature has focused on the impacts of foreign aid, but much rarer are studies that examine how such aid is allocated within countries. We examine the extent to which donors efficiently respond to exogenous budget shocks by shifting resources toward needier districts within a given country, as predicted by theory. We use recently geocoded data on the World Bank’s aid in 23 countries that crossed the lower-middle income threshold between 1995 and 2010 and thus experienced sharp aid reductions. We measure locations’ need along a number of dimensions, including nighttime lights emissions, population density, conflict exposure, and child mortality. We find little evidence that aid project siting is increasingly concentrated in worse-off areas as budgets shrink; the only exception appears to be a growing share of funding in more conflict-affected areas. We further analyze the relationship of health aid to child mortality measures in six key countries, again finding little evidence of efficient responses to budget shocks. Taken together, these results suggest that large efficiency gains may be possible in the distribution of aid from the World Bank and other donors.
In [7, Theorem 1.1], it is claimed that every finite 2-group of exponent 4 occurs as the group of units of a finite ring with characteristic 2. We now know this claim to be false: specifically, [7, Proposition 3.9] and its proof are incorrect. The purpose of this note is to provide a revised statement of [7, Theorem 1.1], which holds for any finite 2-group of exponent 4 and nilpotency class at most 2 and for some (perhaps most) finite 2-groups of exponent 4 and nilpotency class 3. We also exhibit an example of a group of order 64 with exponent 4 and nilpotency class 4 that is not realizable in characteristic 2.
Data redundancy is ubiquitous in the inputs and intermediate results of Deep Neural Networks (DNN). It offers many significant opportunities for improving DNN performance and efficiency and has been explored in a large body of work. These studies are scattered across many venues and several years. The targets they focus on range from images to videos and texts, and the techniques they use to detect and exploit data redundancy also vary in many aspects. There has not yet been a systematic examination and summary of these efforts, making it difficult for researchers to get a comprehensive view of the prior work, the state of the art, differences and shared principles, and the areas and directions yet to explore. This article tries to fill that void. It surveys hundreds of recent papers on the topic, introduces a novel taxonomy that puts the various techniques into a single categorization framework, offers a comprehensive description of the main methods for exploiting data redundancy to improve multiple kinds of DNNs, and points out a set of research opportunities for future work to explore.
The confrontation of prejudicial acts and comments promotes multiple benefits, most notably the prevention of future prejudicial remarks and the reduction of stereotype use. Research, however, consistently shows low rates of confronting prejudice, particularly regarding sexism. Here, we examine whether personal sense of power, known to increase action and activate the behavioral approach system, increases the likelihood of confronting a sexist remark. In Study 1, we demonstrate that for both women and men, self-reported power is associated with a higher frequency of confronting sexism. In Study 2, we manipulate women’s sense of power (i.e., high power, low power, or control) and subsequently present an opportunity to confront a sexist remark. Results show that women primed to feel powerful were more likely to confront the sexist remark and expressed greater disagreement with the comment, compared to women primed to feel powerless. Implications for the confronting literature and behavior are discussed.
Seasonality is a natural feature of wild-caught fisheries that introduces variation in food supply and is often amplified by fisheries management systems. The seasonal timing of landings and its linkages to consumption patterns can have a strong impact on income for coastal communities as well as on import patterns. This study characterizes the relationship between seasonality in seafood production and consumption in the United States by analyzing monthly domestic fisheries landings and imports and retail sales of farmed and wild seafood from 2017 to 2019. Analyses were conducted for total seafood sales, by product form, by species group, and by region of the United States. The data reveal strong seasonal increases in consumption around December and March. Seasonal increases in consumption in spring and summer occurred in parallel with domestic fishing production. Domestic landings vary by region, but most regions have peak fishing seasons between May and October. Alaska has the largest commercial fishery in the United States, and seasonal peaks in Alaska (July/August, February/March) strongly influence seasonality in national landings. Misalignment between domestic production and consumption in some seasons and species groups creates opportunities for imports to supplement demand and lost opportunities for domestic producers.
This convergent mixed-methods study examined the distinct burnout profiles of novice school counselors and their respective professional experiences. A cluster analysis yielded a three-factor solution revealing unique burnout profiles. Qualitative analyses identified job-related challenges and resources impacting the work performance of novice school counselors. We provide recommendations for school counselor training.
Background: Many Americans have adopted popular diet patterns for general health improvement that restrict specific foods, macronutrients, or eating time. However, there is limited evidence to characterize the quality of these diet patterns. Objectives: This study 1) evaluated the quality of popular diet patterns in the United States and 2) modeled the effect of targeted food substitutions on diet quality. Methods: Dietary data from 34,411 adults ≥20 y old were acquired from the NHANES, 2005–2018. Dietary intake was assessed using the National Cancer Institute's usual intake methodology, and the Healthy Eating Index-2015 was used to evaluate diet quality. A diet model was used to evaluate the effect of targeted food substitutions on diet quality. Results: A pescatarian diet pattern had the highest diet quality (65.2; 95% CI: 64.0, 66.4), followed by vegetarian (63.0; 95% CI: 62.0, 64.0), low-grain (62.0; 95% CI: 61.6, 62.4), restricted-carbohydrate (56.9; 95% CI: 56.6, 57.3), time-restricted (55.2; 95% CI: 54.8, 55.5), and high-protein (51.8; 95% CI: 51.0, 62.7) diet patterns. Modeled replacement of ≤3 daily servings of foods highest in added sugar, sodium, saturated fat, and refined grains with alternative foods led to an increase in diet quality and a decrease in energy intake for most diet patterns. Conclusions: Low diet quality was observed for all popular diet patterns evaluated in this study. Modeled dietary shifts that align with recommendations to choose foods lower in added sugar, sodium, saturated fat, and refined grains led to modest improvements in diet quality and larger reductions of energy intake. Greater efforts are needed to encourage the adoption of dietary patterns that emphasize consumption of a variety of high-quality food groups.
The establishment of anterior–posterior (AP) regional identity is an essential step in the appropriate development of the vertebrate central nervous system. An important aspect of AP neural axis formation is the inherent plasticity that allows developing cells to respond to and recover from the various perturbations that embryos continually face during the course of development. While the mechanisms governing the regionalization of the nervous system have been extensively studied, relatively less is known about the nature and limits of early plasticity of the anterior–posterior neural axis. This study characterizes the degree of neural axis plasticity in Xenopus laevis by investigating the response of embryos to a 180-degree rotation of their AP neural axis during gastrula stages, assessing the expression of regional marker genes by in situ hybridization. Our results reveal a narrow window of time between the mid- and late gastrula stages during which embryos are able to undergo significant recovery following a 180-degree rotation of their neural axis and eventually express appropriate regional marker genes, including Otx, Engrailed, and Krox. By the late gastrula stage, embryos show misregulation of regional marker genes following neural axis rotation, suggesting that this profound axial plasticity is a transient phenomenon that is lost by late gastrula stages.
Although it was reported in 2012 that 89% of the world's population had access to piped water, it is estimated that at least one billion people receive this water for fewer than 24 h per day. Intermittency places a variety of burdens upon households, including inadequate quantities of supply at the household level, unpredictability of water utilities in making water available, and a disproportionate time burden on poorer households. For many intermittent water systems, the availability of water is controlled by valvemen who turn access on/off to various portions of their service area. Using reports from these valvemen, NextDrop sends notifications via mobile phone to customers about when water is likely to be available. Although a pilot of NextDrop was successfully implemented in Hubli-Dharwad in India, NextDrop faced significant challenges when expanding to Bangalore. This case study investigates how a breakdown in the information pipeline, as well as corresponding human factors, prevented adoption of NextDrop in Bangalore. Specifically, randomized controlled trials found that valvemen sent reports of their activities to NextDrop only 70% of the time. Even when NextDrop passed messages on to customers, only 38% of customers reported receiving notifications, primarily because either the household "waiters" for water, usually women, did not have daytime access to the mobile phone registered with NextDrop or the notifications were buried under the many other solicitations and informational messages regularly received via SMS. Valvemen were further studied through observation and semi-structured interviews to understand their incentives for complying with NextDrop.
The authors explore the historical context and how it affects gifted students’ psychological and physical well-being. Using Bronfenbrenner’s systems approach, the authors examine events from 1999 to 2022 and how they may have influenced this generation of students. The authors close with ways of dealing with these changes by increasing an ethic of care for one another, dealing with the changes in our times with innovation and creative solutions, and focusing on ways to create a secure world for children that puts care for their positive well-being first.
Learning to teach mathematics is a complex endeavor, requiring sustained focus and time. Yet time is especially scarce in elementary teacher education programs, where preservice teachers (PSTs) learn all content areas. Through a collaborative self-study, five teacher educators identified three time-related tensions in elementary mathematics methods courses: (a) teaching mathematics content and pedagogy; (b) connecting theory and practice; and (c) promoting social contexts in teaching mathematics. To address these tensions, we offer three design principles and illustrative examples: (a) addressing multiple goals for each course component; (b) developing PSTs' dispositions over time; and (c) building on PSTs' strengths to develop understanding of mathematics. We present a reflection tool to assist mathematics teacher educators in designing their courses to maximize their instructional time.
Background: Alzheimer's disease is a specific form of dementia characterized by the aggregation of amyloid-β plaques and tau tangles. New research has found that the formation of these aggregates occurs after dysregulation of cellular respiration and the production of reactive oxygen species. Proteomic data show that these changes are also related to unique gene expression patterns. Objective: This study is designed to incorporate both proteomic and gene expression data into a testable mathematical model for AD. Manipulation of this new model allows the identification of potential therapeutic targets for AD. Methods: We investigate the impact of these findings on new therapeutic targets via metabolic flux analysis of sirtuin stress response pathways while also highlighting the importance of metabolic enzyme activity in maintaining proper respiratory activity. Results: Our results indicate that protective changes in SIRT1 and AMPK expression are potential avenues for therapeutics. Conclusion: Combining our mitochondrial gene expression analyses with available protein data allowed the construction of a new mathematical model for AD that provides a useful approach to test the efficacy of potential AD therapeutic targets.
Background: Alzheimer's disease (AD) is a neurological disease that has both a genetic and non-genetic origin. Mitochondrial dysfunction is a critical component in the pathogenesis of AD as deficits in oxidative capacity and energy production have been reported. Objective: Nuclear-encoded mitochondrial genes were studied in order to understand the effects of mitochondrial expression changes on mitochondrial function in AD brains. These expression data were to be incorporated into a testable mathematical model for AD used to further assess the genes of interest as therapeutic targets for AD. Methods: RT2-PCR arrays were used to assess expression of 84 genes involved in mitochondrial biogenesis in AD brains. A subset of mitochondrial genes of interest was identified after extensive Ingenuity Pathway Analysis (IPA) (Qiagen). Further filtering of this subset of genes of interest was achieved by individual qPCR analyses. Expression values from this group of genes were included in a mathematical model being developed to identify potential therapeutic targets. Results: Nine genes involved in trafficking proteins to mitochondria, morphology of mitochondria, maintenance of mitochondrial transmembrane potential, fragmentation of mitochondria and mitochondrial dysfunction, amyloidosis, and neuronal cell death were identified as significant to the changes seen. These genes include TP53, SOD2, CDKN2A, MFN2, DNM1L, OPA1, FIS1, BNIP3, and GAPDH. Conclusion: Altered mitochondrial gene expression indicates that a subset of nuclear-encoded mitochondrial genes compromise multiple aspects of mitochondrial function in AD brains. A new mathematical modeling system may provide further insights into potential therapeutic targets.
Grounded in theories of workerism (operaismo) and post-workerism (autonomia and post-autonomia), this paper examines how Federico Rizzo’s Fuga dal call center and, more extensively, Paolo Virzì’s Tutta la vita davanti, both films from 2008, register the shift from material to immaterial labor in the twenty-first century. In both films, the call center replaces the factory as the space of labor; it is the privileged site where relational and affective skills, once not considered economically exploitable, become a source of surplus value, one that is gendered feminine. Against the exterior backdrop of a postmodern Rome and with interiors that mimic the non-spaces of the airport and television studio, Virzì stages work in the world of the hyper-real, which has become reality itself. Though they are works of fiction, heirs to the tradition of the 1960s commedia all’italiana, both films engage with an “ethics of the real” as Lucia Nagib has theorized it, and are concerned with addressing the socio-economic condition of a twenty-first-century class-in-the-making, the precariat, and its uncertain future.
Motivation: High-throughput fluorescent microscopy is a popular class of techniques for studying tissues and cells through automated imaging and feature extraction of hundreds to thousands of samples. Like other high-throughput assays, these approaches can suffer from unwanted noise and technical artifacts that obscure the biological signal. In this work we consider how an experimental design incorporating multiple levels of replication enables removal of technical artifacts from such image-based platforms. Results: We develop a general approach to remove technical artifacts from high-throughput image data that leverages an experimental design with multiple levels of replication. To illustrate the methods we consider microenvironment microarrays (MEMAs), a high-throughput platform designed to study cellular responses to microenvironmental perturbations. In application on MEMAs, our approach removes unwanted spatial artifacts and thereby enhances the biological signal. This approach has broad applicability to diverse biological assays. Availability: Raw data is on Synapse (syn2862345), analysis code is on GitHub: gjhunt/mema_norm, and a reproducible Docker image is available on Docker Hub: gjhunt/mema_norm. Supplementary information: Supplementary data are available at Bioinformatics online.
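The core idea above is that replicates sharing the same spatial layout let one separate a reproducible spatial artifact from sample-specific signal. Below is a minimal NumPy sketch of that general idea; the specific estimator (per-position median of level-centered replicates) is an assumption for illustration, not the authors' exact method.

```python
import numpy as np

def remove_spatial_artifact(replicates):
    """Estimate and subtract a spatial artifact shared across replicate
    arrays. `replicates` has shape (n_replicates, n_rows, n_cols).
    Returns (cleaned arrays, estimated artifact)."""
    reps = np.asarray(replicates, dtype=float)
    # Remove each replicate's overall level so only spatial structure remains.
    centered = reps - reps.mean(axis=(1, 2), keepdims=True)
    # The per-position median across replicates estimates the shared
    # spatial artifact (robust to a few unusual replicates).
    artifact = np.median(centered, axis=0)
    return reps - artifact, artifact
```

With more replication levels (e.g., within-plate spots and across-plate arrays, as in MEMAs), the same subtract-the-shared-pattern logic can be applied at each level.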
Let X be the number of successes in n mutually independent and identically distributed Bernoulli trials, each with probability of success p. For fixed n and α, there are n+1 distinct two-sided 100(1−α)% confidence intervals for p associated with the outcomes X = 0, 1, 2, …, n. There is no known exact non-randomized confidence interval for p. Existing approximate confidence interval procedures use a formula, which often requires numerical methods to implement, to calculate confidence interval bounds. The bounds associated with these confidence intervals correspond to discontinuities in the actual coverage function.
The paper does not aim to provide a formula for the confidence interval bounds, but rather to select the confidence interval bounds that minimize the root mean square error of the actual coverage function for sample size n and significance level α in the frequentist context.
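The "actual coverage function" referred to above is the probability, under Binomial(n, p), that the interval produced for the observed X contains the true p. A minimal sketch of computing it is shown below, using the standard Wald interval purely as a stand-in procedure (it is not the procedure the paper proposes):

```python
from math import comb, sqrt

def wald_ci(x, n, z=1.96):
    """Approximate 95% Wald interval for p given x successes in n
    trials; used here only to illustrate the coverage calculation."""
    phat = x / n
    half = z * sqrt(phat * (1 - phat) / n)
    return max(0.0, phat - half), min(1.0, phat + half)

def actual_coverage(p, n, ci=wald_ci):
    """Actual coverage at p: sum the Binomial(n, p) pmf over all
    outcomes x whose interval contains p."""
    return sum(comb(n, x) * p**x * (1 - p)**(n - x)
               for x in range(n + 1)
               if ci(x, n)[0] <= p <= ci(x, n)[1])
```

Sweeping p over (0, 1) traces out the sawtooth-shaped coverage curve whose discontinuities occur exactly at the interval endpoints, which is the function whose root mean square error the paper's selection procedure minimizes.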
3,679 members
Peter Vishton
  • Department of Psychology
Carlos Ayerbe Gayoso
  • Department of Physics
William Cameron Walton
  • Virginia Institute of Marine Science (VIMS)
Joanna Schug
  • Department of Psychology
Don Rahtz
  • Mason School of Business