University of Memphis
  • Memphis, United States
Recent publications
Disease tolerance reduces the per‐pathogen fitness costs of infection for hosts and is an important component of host adaptation to pathogens. However, how disease tolerance affects host transmission potential is not well understood, especially because there are many potential mechanisms that facilitate host tolerance. For example, tissue‐specific host tolerance leads to the reduction of host pathology, regardless of pathogen load. Hosts may also exhibit behavioral tolerance, where normal behaviors are maintained even while harboring high pathogen loads. Here, we examined the impacts that tissue‐specific and behavioral tolerance have on transmission in house finches (Haemorhous mexicanus) infected with a common and highly transmissible bacterial pathogen, Mycoplasma gallisepticum (MG). MG causes conjunctivitis in house finches and severely reduces population numbers after it arrives in a new area. Wild house finch populations differ in tissue‐specific tolerance to MG and here we assessed how this variation in tolerance influences transmission success. We inoculated wild‐captured, MG‐naïve individuals from two populations that are on the extremes of tissue‐specific tolerance to MG and determined the likelihood of these “index” individuals transmitting MG to an uninfected, susceptible cagemate. Higher tissue‐specific tolerance results in reduced conjunctivitis, which is associated with decreased deposition and spread of MG. Thus, we predicted that individuals with high tissue‐specific tolerance would be less likely to transmit MG. In contrast, we predicted that behavioral tolerance would be linked to higher transmission, as more tolerant individuals spent more time on a feeder shared with a susceptible individual despite high pathogen loads. In agreement with our prediction, individuals with high tissue‐specific tolerance were less likely to transmit MG. However, there was no effect of behavioral tolerance on the likelihood of MG transmission. Our results highlight that it is key to consider how different mechanisms of tolerance affect transmission and, therefore, host‐pathogen coevolution and epidemic dynamics.
Pathogen reinfections occur widely, but the extent to which reinfected hosts contribute to ongoing transmission is often unknown despite its implications for host-pathogen dynamics. House finches (Haemorhous mexicanus) acquire partial protection from initial exposure to the bacterial pathogen Mycoplasma gallisepticum (MG), with hosts readily reinfected with homologous or heterologous strains on short timescales. However, the extent to which reinfected hosts contribute to MG transmission has not been tested. We used three pathogen priming treatments (none; intermediate, via repeated low-dose priming; or high, via single high-dose priming) to test how prior pathogen priming alters the likelihood of transmission to a cagemate during index bird reinfection with a homologous or heterologous MG strain. Relative to unprimed control hosts, the highest priming level strongly reduced maximum pathogen loads and transmission success of index birds during reinfections. Reinfections with the heterologous strain, previously shown to be more virulent and transmissible than the homologous strain used, resulted in higher pathogen loads within high-primed index birds and showed higher overall transmission success regardless of host priming treatment. This suggests that inherent differences in strain transmissibility are maintained in primed hosts, leading to the potential for ongoing transmission during reinfections. Finally, among individuals, transmission was most likely from hosts harboring higher within-host pathogen loads. However, associations between disease severity and transmission probability were dependent on a given bird's priming treatment. Overall, our results indicate that reinfections can result in ongoing transmission, particularly where reinfections result from a highly transmissible strain, with potential implications for virulence evolution.
IMPORTANCE
As COVID-19 dramatically illustrated, humans and other animals can become infected with the same pathogen multiple times. Because individuals already have defenses against pathogens that their immune systems encountered before, reinfections are likely less contagious to others, but this is rarely directly tested. We used a songbird species and two strains of its common bacterial pathogen to study how contagious hosts are when their immune systems have some degree of prior experience with a pathogen. We found that reinfected hosts are not as contagious as initially infected ones. However, the more transmissible of the two strains, which also causes more harm to its hosts, was able to multiply more readily than the other strain within reinfected hosts and was more contagious in both reinfected and first-infected hosts. This suggests that reinfections might favor more harmful pathogen strains that are better able to overcome immune defenses.
Chert sourcing is conducted at spatial scales from regional to local to match the scope of the human behavioral question being asked. Studies of where past peoples acquired tool stone can span hundreds of kilometers, from mountain ranges to open plains and across broad river valleys, as they attempt to quantify and differentiate material types and exploited deposits. Successfully characterizing each potential source, however, often requires data collection at a microscopic scale. The primary goal of this study is to examine the benefit of reflectance spectroscopy data at the nanometer scale, using the diagnostic atomic, molecular, and structural information locked inside chert to match artifacts to a geologic/geographic source. Working at Carson‐Conn‐Short, a terminal Pleistocene hunter‐gatherer site along the Tennessee River, United States, the analysis of 58 artifacts identified seven sources and sub‐sources. The case study demonstrates how collecting thousands of electromagnetic reflectance measurements per chert sample and artifact allows the reconstruction of group mobility, social networks, selection decisions, and the use of a landscape of lithic resources.
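The chapter's own matching procedure is not reproduced here; as a purely illustrative sketch, one standard way to compare full-resolution reflectance spectra is the spectral angle, which scores an artifact spectrum against a library of candidate source spectra. All names and data below are hypothetical stand-ins.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two reflectance spectra treated as vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Hypothetical library of mean source spectra sampled at the same 2151
# wavelengths (e.g., 350-2500 nm at 1 nm resolution).
rng = np.random.default_rng(1)
sources = {name: rng.random(2151) for name in ["source_A", "source_B", "source_C"]}
artifact = sources["source_B"] + rng.normal(0, 0.01, 2151)  # noisy match to B

best = min(sources, key=lambda s: spectral_angle(artifact, sources[s]))
print("closest source:", best)
```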
Rabbit, introduced by Boesgaard et al. (2003), is a high-speed stream cipher designed for 128-bit keys and an optional 64-bit initialization vector (IV). It operates using a state and counter vector, applying iterative transformations with XOR, bitwise rotations, and modular addition to ensure robust security without relying on S-boxes or lookup tables. This chapter proposes enhancements to Rabbit’s architecture, including modifications to the g-function, an improved counter update scheme, and a refined initialization process incorporating external pseudo-random number generators. These refinements enhance randomness and distribution properties, making Rabbit an even more secure and efficient stream cipher.
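For reference, a minimal sketch of the original g-function as specified by Boesgaard et al. (the chapter's proposed modification is not reproduced here): it squares the 32-bit sum of a state word and its counter word to 64 bits and XORs the two 32-bit halves.

```python
MASK32 = 0xFFFFFFFF

def rabbit_g(x, c):
    """Rabbit's g-function: square a 32-bit sum and XOR the two 32-bit halves."""
    u = (x + c) & MASK32          # 32-bit addition of state word and counter word
    square = u * u                # exact 64-bit square
    return (square ^ (square >> 32)) & MASK32

print(hex(rabbit_g(0x12345678, 0x9ABCDEF0)))
```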
The Secure And Fast Encryption (SAFE) method presents a novel approach to generating pseudo-random sequences suitable for both simulations and cryptographic security. By introducing a non-linear mixing process, SAFE enhances existing PRNGs that already exhibit the HELP properties: high-dimensional equi-distribution, efficiency, long period length, and portability. While classical linear generators such as MRGs and MT19937 are widely used in simulations, they lack cryptographic security. SAFE overcomes this limitation by incorporating non-linear transformations, preserving computational efficiency while significantly enhancing security.
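SAFE's actual mixing function is defined in the chapter; the sketch below only illustrates the general pattern of layering a non-linear output scrambler over a linear base generator. All constants are chosen arbitrarily for demonstration and are not SAFE's parameters.

```python
# Hypothetical illustration only: a linear base generator whose output is passed
# through a non-linear mixing step. The actual SAFE transformation may differ entirely.
M = (1 << 31) - 1  # Mersenne prime modulus

class LinearBase:
    """Order-2 multiple recursive generator (coefficients chosen for illustration)."""
    def __init__(self, s0=12345, s1=67890):
        self.s = [s0, s1]
    def next(self):
        x = (26403 * self.s[1] + 32706 * self.s[0]) % M
        self.s = [self.s[1], x]
        return x

def nonlinear_mix(x):
    """Toy non-linear output scrambler (xorshift/multiply), not SAFE's actual mix."""
    x ^= (x >> 13)
    x = (x * 2654435761) & 0xFFFFFFFF
    x ^= (x >> 17)
    return x

g = LinearBase()
print([nonlinear_mix(g.next()) for _ in range(3)])
```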
Scientific computing and statistical applications rely heavily on computer-generated random numbers, forming the basis for methodologies such as random sampling and Markov Chain Monte Carlo (MCMC). This chapter explores foundational methods of pseudo-random number generation, emphasizing Inverse Transform Sampling as a core technique. Uniform random number generators (RNGs) serve as the building blocks for these methods, aiming to produce sequences that closely approximate true randomness. Key properties such as efficiency and long-period length are examined, leading to an in-depth discussion of Linear Congruential Generators (LCGs). The chapter further presents theoretical insights into the behavior and effectiveness of LCGs in computational settings.
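As a minimal illustration of both ideas, the sketch below drives inverse transform sampling for the exponential distribution with a small LCG. The parameters are the well-known Numerical Recipes constants; the chapter's own examples may differ.

```python
import math

class LCG:
    """Minimal linear congruential generator x_{n+1} = (a*x_n + c) mod m."""
    def __init__(self, seed=42, a=1664525, c=1013904223, m=2**32):
        self.x, self.a, self.c, self.m = seed, a, c, m
    def uniform(self):
        self.x = (self.a * self.x + self.c) % self.m
        return self.x / self.m  # approximately U(0,1)

def exponential(u, rate=1.0):
    """Inverse transform sampling: F^{-1}(u) = -ln(1-u)/rate for Exp(rate)."""
    return -math.log(1.0 - u) / rate

rng = LCG()
print([exponential(rng.uniform()) for _ in range(5)])
```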
Launched by ECRYPT in 2004, the eSTREAM project sought to advance the field of efficient and secure stream ciphers. The final portfolio was divided into two profiles: Profile 1, optimized for high-performance software applications, features Rabbit, HC-256/HC-128, Salsa/ChaCha, and SOSEMANUK, whereas Profile 2, tailored for hardware-constrained environments, includes Grain v1, MICKEY 2.0, and Trivium. This chapter delves into the characteristics of these ciphers, with a particular focus on HC-128. Key aspects such as security, performance, simplicity, and clarity are evaluated, along with an in-depth analysis of HC-128’s structure, initialization, and keystream generation.
Identifying maximal-period large-order Multiple Recursive Generators (MRGs) necessitates advanced computational techniques. This chapter begins with an exploration of Algorithm AK, a well-established approach to searching for such generators. The discussion then shifts to the complexities involved in maximizing period lengths, including challenges in discovering suitable parameters. Algorithm GMP is introduced as a means of identifying generalized Mersenne primes, which play a pivotal role in constructing efficient MRGs. The chapter also examines critical computational steps, such as verifying primitive polynomials and analyzing prime factorization, culminating in a rigorous framework for the development of high-performance MRGs.
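The period-maximality condition underlying such searches is that the MRG's characteristic polynomial f be primitive over GF(p), i.e., that x have order p^k − 1 in GF(p)[x]/(f). The sketch below brute-forces that check for tiny parameters; Algorithm AK itself is far more sophisticated, and the trial-division factorization here is only viable for small p and k.

```python
def polymulmod(a, b, f, p):
    """Multiply polynomials a, b (coefficient lists, lowest degree first)
    modulo the monic polynomial f and the prime p."""
    k = len(f) - 1
    res = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            res[i + j] = (res[i + j] + ai * bj) % p
    for d in range(len(res) - 1, k - 1, -1):   # reduce degrees >= k
        c = res[d]
        if c:
            for i in range(k + 1):
                res[d - k + i] = (res[d - k + i] - c * f[i]) % p
    out = res[:k]
    while len(out) > 1 and out[-1] == 0:       # normalize trailing zeros
        out.pop()
    return out

def polypowmod(e, f, p):
    """Compute x^e modulo (f, p) by square-and-multiply."""
    result, base = [1], [0, 1]
    while e:
        if e & 1:
            result = polymulmod(result, base, f, p)
        base = polymulmod(base, base, f, p)
        e >>= 1
    return result

def is_primitive(f, p):
    """f (monic, degree k) is primitive over GF(p) iff x has order p^k - 1."""
    k = len(f) - 1
    n = p**k - 1
    if polypowmod(n, f, p) != [1]:
        return False
    factors, q, m = set(), 2, n
    while q * q <= m:                          # trial division; small cases only
        while m % q == 0:
            factors.add(q)
            m //= q
        q += 1
    if m > 1:
        factors.add(m)
    return all(polypowmod(n // q, f, p) != [1] for q in factors)

# x^2 + x + 2 over GF(5), the characteristic polynomial of the MRG
# x_n = (4*x_{n-1} + 3*x_{n-2}) mod 5, which then attains period 5^2 - 1 = 24.
print(is_primitive([2, 1, 1], 5))  # True
```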
A new class of Multiple Recursive Generators, termed DW-k generators, is presented in this chapter. Defined by a degree-k characteristic polynomial that is primitive over a prime modulus p, these generators attain the maximal period p^k − 1. Through the integration of matrix congruential techniques, the chapter demonstrates how DW-k generators combine this maximal-period property with efficient parallelizability. Optimized recurrence relations minimize computational overhead while maintaining high statistical quality. Empirical findings highlight constraints on parameter selection, reinforcing the practical applicability of DW-k generators in demanding simulation tasks.
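The exact DW-k coefficient structure is specified in the chapter; the sketch below only shows the matrix-congruential view it relies on, where the state evolves by a companion matrix A and a stream can jump n steps ahead by computing A^n mod p with repeated squaring. The coefficients here are arbitrary placeholders, not verified maximal-period values.

```python
def mat_mul(A, B, p):
    """k x k matrix product modulo p."""
    k = len(A)
    return [[sum(A[i][t] * B[t][j] for t in range(k)) % p for j in range(k)]
            for i in range(k)]

def mat_pow(A, e, p):
    """A^e mod p by repeated squaring: the jump-ahead operator for e steps."""
    k = len(A)
    R = [[int(i == j) for j in range(k)] for i in range(k)]
    while e:
        if e & 1:
            R = mat_mul(R, A, p)
        A = mat_mul(A, A, p)
        e >>= 1
    return R

P = 2**31 - 1
# Companion matrix of x_n = (a1*x_{n-1} + a2*x_{n-2} + a3*x_{n-3}) mod p.
a1, a2, a3 = 0, 87654321, 12345678       # illustrative placeholders
A = [[a1, a2, a3],
     [1,  0,  0],
     [0,  1,  0]]
state = [3, 2, 1]                        # (x_{n-1}, x_{n-2}, x_{n-3})
J = mat_pow(A, 10**12, P)                # jump 10^12 steps in O(log n) products
print([sum(J[i][j] * state[j] for j in range(3)) % P for i in range(3)])
```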
The HC-128 stream cipher, a reduced variant of HC-256, is part of the eSTREAM portfolio and employs two 512-entry tables (P and Q) initialized with a 128-bit key and an initialization vector (IV). Despite its strong security and efficient keystream generation, HC-128 suffers from slow initialization and lacks rigorous theoretical analysis regarding its period length and distributional properties. The proposed eHC enhancement incorporates two external generators, making minimal modifications to the keystream generation process. These improvements lead to more efficient initialization and enhanced statistical properties, further solidifying HC-128’s reliability in cryptographic applications.
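The toy below is purely schematic: it shows the general idea of refreshing a table-based keystream generator with words from an external PRNG, which is the flavor of modification eHC describes. It is not the HC-128 specification (whose g/h functions are far more involved) and not eHC's actual construction.

```python
import random

MASK32 = 0xFFFFFFFF
table = [random.getrandbits(32) for _ in range(512)]   # stand-in for table P
external = random.Random(2024)                         # stand-in external generator

def keystream_word(i):
    j = i % 512
    # toy feedback step; HC-128's real update uses rotations of several lagged entries
    table[j] = (table[j] + table[(j - 3) % 512]) & MASK32
    # eHC-style tweak (hypothetical form): mix in a word from the external RNG
    table[j] ^= external.getrandbits(32)
    return (table[j] ^ table[(j - 12) % 512]) & MASK32

print([hex(keystream_word(i)) for i in range(4)])
```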
The efficient implementation of the spectral test is crucial for analyzing the lattice structure of multiple recursive generators (MRGs). This chapter first reviews conventional spectral test techniques, focusing on the construction of basis matrices via companion matrices. A detailed mathematical framework is presented to illustrate how sequentially generated values fit within a lattice structure. Building upon this foundation, a novel optimization method is proposed, significantly improving computational efficiency. The discussion integrates both theoretical insights and practical techniques, enhancing the feasibility of large-scale spectral testing.
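To make the lattice picture concrete, the toy sketch below brute-forces the spectral figure of merit ν_t for a tiny LCG: the dual-lattice vectors h satisfy h·(1, a, …, a^(t−1)) ≡ 0 (mod m), and 1/ν_t is the maximal spacing between parallel hyperplanes covering all overlapping t-tuples. Real spectral tests use basis reduction rather than enumeration; the parameters here are illustrative only.

```python
import itertools, math

def spectral_nu(a, m, t, search=20):
    """Brute-force nu_t for an LCG: the shortest nonzero integer vector h with
    h1 + h2*a + ... + ht*a^(t-1) == 0 (mod m). Feasible only for tiny examples."""
    powers = [pow(a, i, m) for i in range(t)]
    best = math.inf
    for h in itertools.product(range(-search, search + 1), repeat=t):
        if any(h) and sum(hi * pi for hi, pi in zip(h, powers)) % m == 0:
            best = min(best, math.sqrt(sum(hi * hi for hi in h)))
    return best  # hyperplane spacing d_t = 1 / nu_t

# Compare the 2D lattice quality of two multipliers for the toy modulus 256.
for a in (137, 21):
    print(a, spectral_nu(a, 256, 2))
```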
Combining pseudo-random number generators offers substantial improvements over standalone methods, extending period lengths and strengthening statistical properties. This chapter investigates the theory and implementation of combined generators, comparing them to individual methods like LCGs and MRGs. Notable examples, such as the Wichmann-Hill generator and L’Ecuyer’s MRG32k3a, are discussed in depth. Theoretical foundations, including applications of the Chinese Remainder Theorem, provide insight into the effectiveness of these combination strategies. However, the increased computational cost and potential loss of equi-distribution at higher dimensions are also examined as trade-offs.
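As a concrete reference point, here is a minimal sketch of the 1982 Wichmann-Hill combination: three small LCGs whose scaled outputs are summed modulo 1. The constants are the published ones; seeding is illustrative.

```python
class WichmannHill:
    """Wichmann-Hill combined generator: three small LCGs summed modulo 1."""
    def __init__(self, s1=1, s2=1, s3=1):
        self.s1, self.s2, self.s3 = s1, s2, s3
    def uniform(self):
        self.s1 = (171 * self.s1) % 30269
        self.s2 = (172 * self.s2) % 30307
        self.s3 = (170 * self.s3) % 30323
        return (self.s1 / 30269 + self.s2 / 30307 + self.s3 / 30323) % 1.0

rng = WichmannHill(123, 456, 789)
print([round(rng.uniform(), 6) for _ in range(3)])
```

Via the Chinese Remainder Theorem, such a combination behaves like a single generator whose modulus is the product of the component moduli, which is why the combined period vastly exceeds any single component's.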
Ensuring the reliability of pseudo-random number generators requires rigorous evaluation, with the spectral test being a key analytical tool. This chapter introduces the spectral test as a method for assessing the uniformity of LCGs and MRGs, offering a quantitative means of ranking generator quality. Key theoretical aspects, such as lattice structure and equi-distribution, are explored in detail. Challenges associated with high-order MRGs, particularly in computational feasibility, are also discussed. The chapter concludes by setting the stage for an improved spectral test methodology, which will be developed further in subsequent discussions.
Improving the security and performance of linear random number generators is a crucial challenge. This chapter investigates various enhancement techniques, including combination generators such as the Wichmann-Hill method, which provides better uniformity and independence compared to traditional Lehmer congruential generators. Additionally, advanced methodologies like shuffle techniques, the TAC approach, and the DX generator are explored. The Addition, Rotation, and XOR (ARX) transformation is analyzed in detail, emphasizing its efficiency and cryptographic robustness. Empirical evaluations assess the impact of these improvements, demonstrating their effectiveness in secure and high-performance computing applications.
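To illustrate the ARX building block concretely, the sketch below applies one addition-rotation-XOR round to a pair of 64-bit words. The rotation constant is arbitrary here; real ARX designs such as ChaCha or Speck fix their constants through cryptanalysis.

```python
MASK64 = (1 << 64) - 1

def rotl(x, r):
    """64-bit left rotation."""
    return ((x << r) | (x >> (64 - r))) & MASK64

def arx_mix(a, b):
    """One Addition-Rotation-XOR round (rotation constant is illustrative)."""
    a = (a + b) & MASK64   # Addition mod 2^64: non-linear over GF(2)
    b = rotl(b, 13)        # Rotation: diffuses bits across positions
    b ^= a                 # XOR: cheap linear mixing
    return a, b

a, b = 0x0123456789ABCDEF, 0xFEDCBA9876543210
for _ in range(4):         # a few rounds of mixing
    a, b = arx_mix(a, b)
print(hex(a), hex(b))
```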
Enhancing the RC4 cipher has been a long-standing research focus. This chapter introduces eRC, a framework that integrates an external random number generator (RNG) to bolster both security and efficiency. Previous extensions of RC4, such as RC4A and RC4+, have shown improvements but still suffer from weaknesses in random index selection and table updates. The eRC framework addresses these limitations by continuously updating the S-table through an external RNG, resulting in improved uniformity, enhanced randomness, and stronger security. Furthermore, eRC achieves higher throughput, making it a robust alternative for stream cipher applications.
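For context, the sketch below implements standard RC4 key scheduling and keystream generation, plus an optional perturbation of the index j by an external RNG. The perturbation is a schematic stand-in for the kind of continuous S-table refresh eRC describes, not eRC's actual specification.

```python
import random

def rc4_ksa(key):
    """Standard RC4 key-scheduling algorithm."""
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    return S

def rc4_prga(S, n, external=None):
    """Standard RC4 keystream generation; if `external` is given, perturb the
    index j with external random words (hypothetical eRC-style step)."""
    i = j = 0
    out = []
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        if external is not None:
            j = (j + external.randrange(256)) % 256  # hypothetical injection point
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return out

print(rc4_prga(rc4_ksa(b"Key"), 8))                        # plain RC4
print(rc4_prga(rc4_ksa(b"Key"), 8, random.Random(7)))      # with perturbation
```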
Expanding the capabilities of Multiple Recursive Generators involves leveraging designs with numerous nonzero terms. This chapter examines both the theoretical underpinnings and implementation challenges associated with such generators. While MRGs exhibit superior uniformity properties compared to LCGs, their efficient deployment remains a technical challenge, particularly in parallel computing. A novel approach to identifying maximum-period MRGs is introduced, alongside a method for improving their computational efficiency. These contributions enhance the practical viability of MRGs while maintaining their desirable statistical characteristics.
Parallel computing demands pseudo-random number generators that maintain independence and reproducibility across multiple processing units. This chapter introduces an automated method for parallelizing PRNGs, with a focus on Linear Congruential Generators (LCGs) and DX generators. Essential properties such as scalability, long periods, and efficiency are analyzed. Traditional LCG parallelization approaches are evaluated, revealing inherent limitations. The proposed method systematically constructs independent parallel sequences while mitigating overlap issues, ensuring improved performance in high-performance computing environments.
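One classical baseline the chapter builds on is leapfrog partitioning of an LCG: every K-th term of x_{n+1} = (a·x_n + c) mod m is itself an LCG with multiplier A = a^K mod m and increment C = c·(a^(K−1) + … + 1) mod m, so K processors can each generate a disjoint interleaved stream. A minimal sketch, using illustrative LCG parameters:

```python
def leapfrog_params(a, c, m, K):
    """LCG parameters generating every K-th term of x_{n+1} = (a*x_n + c) mod m."""
    A = pow(a, K, m)
    C = c * sum(pow(a, i, m) for i in range(K)) % m
    return A, C

a, c, m, K = 1664525, 1013904223, 2**32, 4   # 4 parallel streams

def base_seq(x, n):
    out = []
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x)
    return out

A, C = leapfrog_params(a, c, m, K)
seeds = base_seq(42, K)                  # first K terms seed the K streams
stream0 = [seeds[0]]
for _ in range(3):
    stream0.append((A * stream0[-1] + C) % m)
print(stream0 == base_seq(42, 16)[0::K])  # True: stream 0 is every 4th term
```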
Stream cipher security is heavily dependent on the quality of the pseudo-random number generators (PRNGs) used to generate key streams. While true randomness is ideal, real-world implementations must rely on deterministic algorithms, making the development of high-quality PRNGs essential. This chapter explores the critical attributes of secure generators, such as unpredictability, efficiency, and statistical robustness. It compares PRNGs designed for computational simulations with those intended for cryptographic security, exposing the vulnerabilities of linear generators. The discussion covers key PRNGs, including LFSRs, MRGs, and MT19937, along with cryptographic alternatives like DRBGs, examining their strengths, weaknesses, and practical applications.
Achieving high-quality pseudo-random numbers necessitates sophisticated generator designs, particularly in scientific simulations. This chapter investigates the development and implementation of large-order Multiple Recursive Generators (MRGs), which surpass traditional Linear Congruential Generators in both period length and distribution properties. The limitations of classical approaches, including inadequate dimensional uniformity, highlight the need for MRGs. Various aspects, such as memory requirements and initialization costs, are discussed in relation to large-order MRGs. Additionally, this chapter introduces DX generators as an extension of MRGs, illustrating their empirical advantages and practical deployment strategies.
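DX generators restrict the MRG recurrence to a few lagged terms sharing a single multiplier B, so each step costs only a couple of additions and one multiplication regardless of the order k. The sketch below uses the commonly cited DX-k-2 form x_n = B·(x_{n−1} + x_{n−k}) mod p; the parameters B and k here are illustrative placeholders, not verified maximal-period values from the chapter's tables.

```python
from collections import deque

P = 2**31 - 1   # prime modulus
B = 1073741362  # illustrative multiplier; maximal-period values are tabulated elsewhere
K = 47          # illustrative order k

class DXk2:
    """Sketch of a DX-k-2 style recurrence x_n = B*(x_{n-1} + x_{n-k}) mod p."""
    def __init__(self, seed_state):
        assert len(seed_state) == K and any(seed_state)
        self.state = deque(seed_state, maxlen=K)   # holds x_{n-k} .. x_{n-1}
    def next(self):
        x = B * (self.state[-1] + self.state[0]) % P
        self.state.append(x)                        # drops x_{n-k} automatically
        return x

g = DXk2(list(range(1, K + 1)))
print([g.next() for _ in range(3)])
```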
In this computational study, we self-consistently calculate the rate constants of mutual neutralization reactions by incorporating the electron transfer probability, using Landau–Zener state transition theory with inputs derived from ab initio quantum chemistry calculations, into classical trajectory simulations. Electronic structure calculations are done using correlation consistent basis sets with multi-reference configuration interaction to map all the molecular electronic states below the ion-dissociation limit as a function of the distance between the reacting species. Our electronic structure calculations have been significantly improved from our previous work [Liu et al., J. Chem. Phys. 159, 114111 (2023)] through improved selection of molecular electronic configurations maintaining a fine grid of 1 a_0 over a wide range of bond lengths and accurate treatment of spin–orbit couplings. Non-adiabatic coupling matrix elements are calculated with the three-point central difference method near each avoided crossing to estimate the exact crossing point R_x and coupling parameter H_if, which are inputs to the multi-channel Landau–Zener theory to calculate the electron transition probability. Our approach is applied to estimate the mutual neutralization rate constants for the following ion pairs: Ar⁺–Cl⁻, Ar⁺–Br⁻, Ar⁺–I⁻ at ∼133 Pa. Our predictions are compared against the experimental data reported by Shuman et al. [J. Chem. Phys. 140, 044304 (2014)]. It is seen that the improvement in the electronic structure calculation results in excellent agreement between the simulation results and the available experimental data to within a factor of ∼2 or ∼±50%.
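For reference, the standard two-state Landau–Zener single-passage probability that underlies the multi-channel treatment can be written, in the abstract's notation, as below; the radial velocity v_r and the diabatic slope difference ΔF at the crossing point R_x are introduced here for illustration.

```latex
% Standard two-state Landau-Zener single-passage transition probability at an
% avoided crossing R_x (the study uses its multi-channel generalization):
p_{\mathrm{LZ}} = \exp\!\left(-\,\frac{2\pi H_{if}^{2}}{\hbar\, v_r\,\lvert \Delta F \rvert}\right),
\qquad \Delta F = \left.\frac{d}{dR}\bigl(V_i - V_f\bigr)\right|_{R = R_x}
```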
5,532 members
John Leicester Williams
  • Department of Biomedical Engineering
Shongkour Roy
  • School of Public Health
Shaun Gallagher
  • Department of Philosophy
Information
Address
Memphis, United States
Head of institution
M. David Rudd