AFB International
  • St. Louis, United States
Recent publications
Crop microbiome-phyllosphere interactions are vital for intercropping disease control. While evidence suggests intercropping enhances nutrient efficiency and recruits beneficial bacteria at the root-soil interface, limited research has explored changes in the phyllosphere microbiome in response to foliar diseases. Pot experiments were conducted to investigate how wheat stripe rust, caused by Puccinia striiformis f. sp. tritici (Pst), affects shoot biomass, total nitrogen and total potassium content, and changes in phyllosphere bacteria, co-occurrence networks, and ecological functions in wheat and faba bean intercropping, using 16S amplicon sequencing. Microbial characterization revealed that Pst infection altered the phyllosphere bacteria of wheat and faba bean, reducing evenness and richness, shifting community structure with the emergence of new bacteria, and weakening microbial networks under both planting patterns. Compared to monocropping, intercropping displayed bacterial features of an integrated system, fortified community diversity with new bacteria such as Pseudomonas aeruginosa in wheat and Pantoea and Burkholderia in faba bean, and strengthened the bacterial network of wheat in response to Pst infection. Intercropping enhanced nitrogen and carbon cycling in wheat leaves by improving nitrite respiration and aromatic hydrocarbon degradation. The enriched phyllosphere bacterial structure positively correlated with crop performance by balancing shoot biomass, nitrogen, and potassium accumulation between wheat and faba bean upon Pst infection, mitigating the disease index of wheat. Overall, these changes may positively correlate with functions aiding wheat in suppressing wheat stripe rust. The findings provide new insight into how intercropping improves the response to pathogen challenge as an effective planting practice.
The surge in the global demand for plant-based proteins has catapulted pulse protein into the spotlight. To ensure economic viability and sustainable production, it is crucial to utilize pulse starch, a by-product of plant protein fractionation. Despite the increasing interest in pulse starches, there is a notable gap in knowledge regarding their modifications and applications compared to cereal and tuber starches. Non-thermal techniques such as electron beam radiation, static high pressure, microfluidization, and cold plasma are emerging as innovative methods for starch modification. These techniques offer significant advantages, including enhanced safety, environmental sustainability, and the development of unique functional properties unattainable through conventional methods. However, challenges such as equipment availability, high costs, and energy consumption hinder their widespread adoption. In light of the growing emphasis on “clean and green labelling” and effective “waste management” in food production, evaluating non-thermal techniques for pulse starch modification is critical. This review aims to thoroughly assess these non-thermal techniques and their combinations, offering valuable insights for researchers and the food industry. By maximizing the potential of pulse starches in innovative food applications, it provides a comprehensive guide for effective non-thermal methods that add value and align with sustainable practices.
Plant proteins often carry off-notes, necessitating customized aroma addition. In vitro studies have revealed protein–aroma binding that limits release during consumption. This study employs in vivo nose-space proton transfer reaction–time-of-flight mass spectrometry (PTR-ToF-MS) and dynamic sensory evaluation (time intensity) to explore in-mouth interactions. In a lupin protein-based aqueous system, sensory evaluation of a trained “green” attribute was conducted simultaneously with measurement of the aroma release of hexanal, nonanal, and 2-nonanone during consumption. Results demonstrated that enlarging the aldehyde chain and relocating the keto group reduced the maximum perceived intensity (Imax_R) by 71.92 and 72.25%, respectively. Protein addition decreased Imax_R by 30.91, 36.84, and 72.41%, indicating protein–aroma interactions. Sensory findings revealed a lower perceived intensity upon protein addition. Aroma lingering correlated with the aroma compounds’ volatility and hydrophobicity, with nonanal exhibiting the longest persistence. In vitro, mucin addition increased aroma binding 4- to 12-fold. Combining PTR-ToF-MS and time intensity elucidated crucial food behavior, i.e., protein–aroma interactions, that is pivotal for food design.
Two distinct ultra-thin Ge1−xSnx (x ≤ 0.1) epilayers were deposited on (001) Si substrates at 457 and 313 °C through remote plasma-enhanced chemical vapor deposition. These films are considered potential initiation layers for synthesizing thick epitaxial GeSn films. The GeSn film deposited at 313 °C has a thickness of 10 nm and exhibits a highly epitaxial continuous structure with its lattice being compressed along the interface plane to coherently match Si without mismatch dislocations. The GeSn film deposited at 457 °C exhibits a discrete epitaxial island-like morphology with a peak height of ∼30 nm and full-width half maximum (FWHM) varying from 20 to 100 nm. GeSn islands with an FWHM smaller than 20 nm are defect free, whereas those exceeding 25 nm encompass nanotwins and/or stacking faults. The GeSn islands form two-dimensional modulated superlattice structures at the interface with Si. The GeSn film deposited at 457 °C possesses a lower Sn content compared to the one deposited at lower temperature. The potential impact of using these two distinct ultra-thin layers as initiation layers for the direct growth of thicker GeSn epitaxial films on (001) Si substrates is discussed.
Food protein–flavor binding influences flavor release and perception. The complexity of the binding phenomenon lies in the flavor and protein properties. Thus, molecular interactions between commercial whey- or plant-based protein isolates (PI), such as pea, soy, and lupin, and carbonyl and alcohol flavor compounds were assessed by static headspace (HS) GC-MS. HS results showed that binding (up to 52.76 ± 4.65%) was promoted not only by displacement of the carbonyl group from the inner part of the flavor structure toward the edge but also by the flavor’s degree of unsaturation. Similarly, thermal treatment led to a slight increase in hexanal–protein binding because of possible protein conformational changes. The protein’s residual fat (<1%) seemed insufficient to promote significant flavor binding to PI. Despite the complexity of commercial food protein isolates, the results showed that binding is predominantly influenced by the flavor’s structure and physicochemical properties, with the protein source and residual fat playing a secondary role.
Prism adaptation (PA) is a well-known and widely used technique for rehabilitating unilateral spatial neglect and studying sensory–motor plasticity. However, there is conflicting evidence in the literature regarding its effectiveness, which may arise from differences in the type of prisms used, the clinical characteristics of the patients, and the training procedure. Individual differences may play a role in PA effectiveness in rehabilitating neglect, affecting both its development and its effects. Field-dependent/independent cognitive style is a pervasive characteristic of individual functioning, affecting how environmental information is processed. Here, we tested the hypothesis that cognitive style plays a role in PA efficacy by submitting 38 healthy participants, classified as field-dependent (FD, N = 19) or field-independent (FI, N = 19) using the Embedded Figure Test, to a prism adaptation protocol. Results show that during the exposure phase, FI individuals needed fewer pointing movements than FD individuals to reduce the deviation error. However, there were no differences in the extinction of sensory–motor and cognitive after-effects. These results suggest that prismatic adaptation is affected by individuals’ cognitive style, since FI individuals need fewer trials to reach adaptation, and this could explain why using this rehabilitation technique with a single, standard protocol is not always effective.
Food flavourings are often added to enhance the overall flavour experience during food consumption, and their use in plant‐based food analogues is crucial. Flavour sensory perception is strongly mediated by flavour–food matrix interactions. Aroma molecules establish chemical and physical bonds with lipids and carbohydrates, but proteins play a pivotal role: their strong flavour binding minimizes aroma release and quenches flavour perception. Consequently, final food quality is reduced, and so is consumer acceptance. Depending on the chemical structure of the flavour and the type of protein involved, the strength of the flavour–protein binding can vary. It is of great importance to understand whether this interaction persists under dynamic conditions such as oral processing during food consumption. This review aims to provide insights into the influence of the chemical structure and physicochemical features of both proteins and flavours on the binding mechanism. Moreover, the potential of coupling in vivo instrumental analysis with sensory methods to study flavour release in real time is explored. Elucidating the drivers of flavour interaction with, and release from, the food matrix is essential for developing flavour solutions for the new generation of plant‐based food products. Protein–flavour binding affects final food flavour quality. The molecule’s structure is key to protein–flavour binding. Coupling real‐time analyses with sensory methods facilitates the study of flavour release and its correlation with flavour perception.
Introduction This lessons learned paper provides recommendations for novice investigators to consider when writing a research protocol, specifically when it involves clinical staff with varying levels of research experience, spans multiple departments, and is conducted at a non-academic medical center. It further explores each specific lesson with generalizability to help future novice investigators successfully develop and implement their own research studies. Methods Several lessons were learned during the development and implementation of the research team’s original study. These lessons include: (1) Conduct feasibility assessments; (2) Assess external factors; (3) Partner with stakeholder(s); (4) Develop tools that promote transparency; (5) Coordinate with Information Technology personnel; and (6) Engage and educate stakeholders. Conclusion The aim of the original study was to determine whether unrestricted oral intake of low-fat, low-residue foods during labor impacts maternal and neonatal outcomes, with the goal of contributing an adequately powered study to the current literature. Due to the challenges experienced in executing this study, the findings could not be generalized. However, the challenges encountered are not specific to the original focus of the researchers’ study. Each of the lessons is generalizable and can be applied to nursing research. As nurses begin to develop clinical research protocols, utilizing the lessons learned in this paper may help ensure successful implementation and completion of their research.
Maritime transportation is a major contributor to the world economy, but it has significant social and environmental impacts. Each impact calls for different technical or operational solutions. Among these solutions, we found that speed reduction measures appear to mitigate several issues: (1) collision with wildlife; (2) collision with non-living objects; (3) underwater noise; (4) invasive species; and (5) gas emissions. We do not claim that speed reduction is the best solution for each individual issue mentioned in this paper, but we argue that it could be a key solution for significantly reducing these threats collectively. Further interdisciplinary research is required to balance the private economic costs of speed reduction measures against the environmental and social benefits emerging from all mitigated issues.
As this is more of a reference article, I chose not to include an abstract, similar to the paper I wrote in 2016 regarding flat feet in the military.
We consider the bilevel quadratic knapsack problem (BQKP), which models settings where a leader appropriates a budget for a follower, who then solves a quadratic 0–1 knapsack problem (QKP). BQKP generalizes the bilevel knapsack problem introduced by Dempe and Richter (Cent Eur J Oper Res 8(2):93–107, 2000), in which the follower solves a linear 0–1 knapsack problem. We present an exact-solution approach for BQKP based on extensions of dynamic programs for QKP bounds and the branch-and-backtrack algorithm. We compare our approach against a two-phase method that uses an optimization solver in both phases: it first computes the follower’s value function for all feasible leader decisions, and then solves a single-level, value-function reformulation of BQKP as a mixed-integer quadratically constrained program. Our computational experiments show that our approach for solving BQKP outperforms the two-phase method implemented with a state-of-the-art commercial optimization solver.
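To make the two-phase method’s first step concrete, the sketch below tabulates a follower’s quadratic 0–1 knapsack value function over candidate leader budgets. This is a hypothetical toy using brute-force enumeration, not the paper’s dynamic-programming bounds or branch-and-backtrack algorithm, and it is viable only for very small instances; the function names and instance data are illustrative.

```python
def qkp_value(profits, quad, weights, capacity):
    """Follower's 0-1 quadratic knapsack: maximize linear profits plus
    pairwise profits quad[i][j] subject to a weight budget, by
    brute-force enumeration (exponential; toy instances only)."""
    n = len(profits)
    best = 0
    for mask in range(1 << n):
        items = [i for i in range(n) if mask >> i & 1]
        if sum(weights[i] for i in items) > capacity:
            continue  # infeasible subset for this budget
        value = sum(profits[i] for i in items)
        value += sum(quad[i][j] for i in items for j in items if i < j)
        best = max(best, value)
    return best


def follower_value_function(profits, quad, weights, budgets):
    """Phase one of the two-phase method: tabulate the follower's
    optimal value for every candidate leader budget."""
    return {b: qkp_value(profits, quad, weights, b) for b in budgets}
```

With the value function in hand, the second phase would optimize the leader’s objective over budgets using these tabulated follower responses.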
Introduction In collaboration with the ECHO (Extension for Community Healthcare Outcomes) Institute since 2012, the Army, Navy, and Air Force have developed medical teleECHO programs to address various health and safety issues affecting military personnel. This article describes and compares the current state of military teleECHOs as well as their growth and change over time. Materials and Methods This study evaluated continuing education units (CEUs) offered, average session attendance, and number of spoke sites for current military teleECHO programs across the service branches. Results Between 2012 and 2019, the military teleECHO initiative grew from one program to seven different teleECHO programs, covering topics from pain to diabetes to amputee care. Military ECHOs now provide training in 10 countries and 27 US states. Between October 2018 and September 2019, the military ECHO programs provided a total of 51,769 continuing medical education (CME) hours to a total of 3,575 attendees from 223 spoke sites. Conclusions The military has successfully used the ECHO model to improve the health and safety of active-duty military, retirees, and dependents.
When implementing a sanitation system, selecting the treatment process can be difficult. Beyond removal efficiency and effluent concentrations, reliability should be taken into account. This study compares the reliability of French vertical flow treatment wetlands (F-VFTW) with the four main decentralized wastewater treatment technologies used in small communities in the French Overseas Territories (FOT). Analysis of 963 regulatory self-monitoring sampling campaigns performed on 213 wastewater treatment plants shows that operational disruptions due to sludge loss and loss of nitrification are often reported for activated sludge technology; rotating biological contactors often suffer from poor settling; facultative pond removal is limited by algae; and F-VFTW fulfills all the French regulatory objectives at a frequency of 90 to 95%. In addition, the data from this study are compared to a similar database from Brazil using a statistical approach (coefficient of reliability). Among the eight decentralized wastewater treatment technologies evaluated, F-VFTW appears to be the most appropriate for achieving the discharge standard, with a reliability close to 95%. Its reliability in the face of both environmental (rainfall) and social (maintenance capacity) constraints is a key parameter.
We evaluated the effectiveness of NORTH STAR, a community assessment, planning, and action framework to reduce the prevalence of several secretive adult problems (hazardous drinking, controlled prescription drug misuse, suicidality, and clinically significant intimate partner violence and child abuse [both emotional and physical]) as well as cumulative risk. One-third of US Air Force (AF) bases worldwide were randomly assigned to NORTH STAR (n = 12) or an assessment-and-feedback-only condition (n = 12). Two AF-wide, cross-sectional, anonymous, web-based surveys were conducted of randomly selected samples assessing risk/protective factors and outcomes. Process data regarding attitudes, context, and implementation factors were also collected from Community Action Team members. Analyzed at the level of individuals, NORTH STAR significantly reduced intimate partner emotional abuse, child physical abuse, and suicidality, at sites with supportive conditions for community prevention (i.e., moderation effects). Given its relatively low cost, use of empirically supported light-touch interventions, and emphasis on sustainability with existing resources, NORTH STAR may be a useful framework for the prevention of a range of adult behavioral health problems that are difficult to impact.
This study examines patterns of skeletal trauma in propeller‐driven aircraft crashes and blast‐related ground loss incidents from WWII. Specifically, descriptions and criteria used to characterize aircraft deceleration‐ versus blast‐related skeletal injuries are examined from 35 recently identified forensic anthropology cases to determine possible diagnostic traits and characterize skeletal trauma associated with these events. Among these cases, blast trauma is more localized within the skeleton and is associated with one or a few primary directions of force. It is recommended that analysts differentiate between secondary and nonspecific blast trauma categories. Conversely, aircraft crash deceleration trauma is more widespread throughout the skeleton, with torsional fractures and injuries occurring from multiple or indeterminate directions. These traits reflect factors such as loading environments more complex than those seen in blast events. Two case studies are presented in detail to further illustrate differences between aircraft crash and blast‐related incidents. Both emphasize consideration of the body as a whole unit to facilitate interpretations. While the cases presented herein result from the historic war‐related casualties that characterize the Defense POW/MIA Accounting Agency’s (DPAA) casework, these skeletal cases provide guidelines more appropriate than clinically derived criteria developed through assessments of soft tissue injuries. These guidelines can be used by anthropologists and pathologists working with skeletal remains from mass disasters and other complex contexts, and they provide avenues for future research.
Ecological assessment of lakes and rivers using benthic diatom assemblages currently requires considerable taxonomic expertise to identify species using light microscopy. This traditional approach is also time-consuming. Diatom metabarcoding is a promising alternative, and there is increasing interest in using this approach for routine assessment. However, until now, analysis protocols for diatom metabarcoding have been developed and optimised by research groups working in isolation. The diversity of existing bioinformatics methods highlights the need for an assessment of the performance and comparability of results of different methods. The aim of this study was to test the correspondence of outputs from six bioinformatics pipelines currently in use for diatom metabarcoding in different European countries. Raw sequence data from 29 biofilm samples were treated by each of the bioinformatics pipelines, five of them using the same curated reference database. The outputs of the pipelines were compared in terms of sequence unit assemblages, taxonomic assignment, biotic index score and ecological assessment outcomes. The last three components were also compared to outputs from traditional light microscopy, which is currently accepted for ecological assessment of phytobenthos, as required by the Water Framework Directive. We also tested the performance of the pipelines on the two DNA markers (rbcL and 18S-V4) that are currently used by the working groups participating in this study. The sequence unit assemblages produced by different pipelines showed significant differences in terms of assigned and unassigned read numbers and sequence unit numbers. When comparing the taxonomic assignments at genus and species level, correspondence of the taxonomic assemblages between pipelines was weak. Most discrepancies were linked to differential detection or quantification of taxa, despite the use of the same reference database.
Subsequent calculation of biotic index scores also showed significant differences between approaches, which were reflected in the final ecological assessment. Use of the rbcL marker always resulted in better correlation among molecular datasets and in results closer to those generated using traditional microscopy. This study shows that decisions made in pipeline design have implications for the dataset's structure and the taxonomic assemblage, which in turn may affect biotic index calculation and ecological assessment. There is a need to define best-practice bioinformatics parameters in order to ensure the best representation of diatom assemblages. Only the use of similar parameters will ensure the compatibility of data from different working groups. The future of diatom metabarcoding for ecological assessment may also lie in the development of new metrics using, for example, presence/absence instead of relative abundance data.
Surrogate models for hotspot ignition and growth rates were presented in Part I (Nassar et al., Shock Waves 29(4):537–558, 2018), where the hotspots were formed by the collapse of single cylindrical voids. Such isolated cylindrical voids are idealizations of the void morphology in real meso-structures. This paper therefore investigates the effect of non-cylindrical void shapes and void–void interactions on hotspot ignition and growth. Surrogate models capturing these effects are constructed using a Bayesian Kriging approach. The training data for machine learning the surrogates are derived from reactive void collapse simulations spanning the parameter space of void aspect ratio, void orientation (θ), and void fraction (ϕ). The resulting surrogate models portray strong dependence of the ignition and growth rates on void aspect ratio and orientation, particularly when voids are oriented at acute angles with respect to the imposed shock. The surrogate models for void interaction effects show significant changes in hotspot ignition and growth rates as the void fraction increases. The paper elucidates the physics of hotspot evolution in void fields due to the creation and interaction of multiple hotspots. The results from this work will be useful not only for constructing meso-informed macroscale models of HMX, but also for understanding the physics of void–void interactions and sensitivity due to void shape and orientation.
Engineering systems experiencing events of amplitudes higher than 100 gn for a duration under 100 ms, here termed high-rate dynamics, can undergo rapid damaging effects. If the structural health of such systems could be accurately estimated in a timely manner, preventative measures could be employed to minimize adverse effects. For complex high-rate problems, adaptive observers have shown promise due to their capability to deal with nonstationary, noisy, and uncertain systems. However, adaptive observers have slow convergence rates, which impede their applicability to high-rate problems. To improve the convergence rate, we propose a variable input space concept for optimizing the use of data history in high-rate dynamics, with the objective of producing an optimal representation of the system of interest. Using embedding theory, the algorithm sequentially selects and adapts a vector of inputs that preserves the essential dynamics of the high-rate system. In this paper, the variable input space is integrated into a wavelet neural network, which constitutes a variable input observer. The observer is simulated using experimental data from a high-rate system. Different input space adaptation methods are studied, and the performance is also compared against an optimized fixed input strategy. It is found that a smooth transition of the input space eliminates error spikes and yields faster convergence. The variable input observer is further studied in a hybrid model-/data-driven formulation, and results demonstrate significant improvement in performance gained from the added physical knowledge.
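The embedding step described above can be sketched as follows. This is a generic Takens-style delay-coordinate construction, not the paper’s adaptive input-selection algorithm; the function name and the dimension and delay values are illustrative assumptions.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Build delay-coordinate vectors [y_k, y_{k+tau}, ..., y_{k+(dim-1)*tau}]
    from a scalar time series; such vectors form the kind of input space
    an embedding-based observer selects from."""
    series = np.asarray(series, dtype=float)
    n = len(series) - (dim - 1) * tau  # number of complete delay vectors
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    # Column i holds the series shifted by i*tau; rows are input vectors.
    return np.stack([series[i * tau : i * tau + n] for i in range(dim)], axis=1)
```

A variable input observer would then adapt `dim` and `tau` online as the dynamics evolve, rather than fixing them in advance.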
Our study of 164 diatom samples from Catalonia (NE Spain) is the first to evaluate the applicability of DNA metabarcoding, based on high throughput sequencing (HTS) using a 312-bp rbcL marker, for biomonitoring Mediterranean rivers. For this, we compared the values of a biotic index (IPS) and the ecological status classes derived from them, between light microscope-based (LM) and HTS methods. Very good correspondence between methods gives encouraging results concerning the applicability of DNA metabarcoding for Catalan rivers for the EU Water Framework Directive (WFD). However, in 10 sites, the ecological status class was downgraded from “Good”/“High” obtained by LM to “Moderate”/“Poor”/“Bad” by HTS; these “critical” sites are especially important, because the WFD requires remedial action by water managers for any river with Moderate or lower status. We investigated the contribution of each species to the IPS using a “leave-one-out” sensitivity analysis, paying special attention to critical sites. Discrepancies in IPS between LM and HTS were mainly due to the misidentification and overlooking in LM of a few species, which were better recovered by HTS. This bias was particularly important in the case of Fistulifera saprophila, whose clear underrepresentation in LM was important for explaining 8 out of the 10 critical sites and probably reflected destruction of weakly-silicified frustules during sample preparation. Differences between species in the rbcL copy number per cell affected the relative abundance obtained by HTS for Achnanthidium minutissimum, Nitzschia inconspicua and Ulnaria ulna, which were also identified by the sensitivity analysis as important for the WFD. Only minor IPS discrepancies were attributed to the incompleteness of the reference library, as most of the abundant and influential species (to the IPS) were well represented there. 
Finally, we propose that leave-one-out analysis is a good method for identifying priority species for isolation and barcoding.
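The leave-one-out analysis proposed above can be sketched for a generic Zelinka–Marvan-type biotic index (the abundance- and indicator-weighted average of species sensitivity values on which IPS is based): recompute the index with each species removed and record the change. The toy abundance, sensitivity, and indicator values below are illustrative assumptions, not data from the study.

```python
def weighted_index(abund, sens, indic):
    """Zelinka-Marvan-style index: weighted average of species
    sensitivity values, weighted by abundance and indicator value."""
    num = sum(a * s * v for a, s, v in zip(abund, sens, indic))
    den = sum(a * v for a, v in zip(abund, indic))
    return num / den


def leave_one_out(abund, sens, indic):
    """Contribution of each species: change in the index when that
    species alone is removed from the assemblage."""
    full = weighted_index(abund, sens, indic)
    deltas = []
    for i in range(len(abund)):
        a = abund[:i] + abund[i + 1:]
        s = sens[:i] + sens[i + 1:]
        v = indic[:i] + indic[i + 1:]
        deltas.append(weighted_index(a, s, v) - full)
    return deltas
```

Species whose removal shifts the index across a status-class boundary would be flagged as priorities for isolation and barcoding.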