ABSTRACT: BACKGROUND: Abattoir condemnations may play an important role in a food animal syndromic surveillance system. Portion condemnation data may be particularly useful, as these data can provide more specific information on health outcomes than whole carcass condemnation data. Various seasonal, secular, disease, and non-disease factors have previously been identified as associated with whole carcass condemnation rates in Ontario provincial abattoirs; if ignored, these factors may bias the results of quantitative disease surveillance methods. The objective of this study was to identify seasonal, secular, and abattoir characteristic factors that may be associated with bovine portion condemnation rates, and to compare these variables with previously identified factors associated with bovine whole carcass condemnation rates. RESULTS: Data were collected from the Ontario Ministry of Agriculture, Food and Rural Affairs (OMAFRA) and the Ontario Cattlemen's Association regarding "parasitic liver" and pneumonic lung condemnation rates for different cattle classes, abattoir compliance ratings, and the monthly sales-yard price for commodity classes from 2001 to 2007. To control for clustering by abattoir, multi-level Poisson modeling was used to investigate the association between the following variables and "parasitic liver" as well as pneumonic lung condemnation rates: year, season, annual abattoir audit rating, geographic region, annual abattoir operating time, annual total number of animals processed, animal class, and commodity sales price. CONCLUSIONS: In this study, "parasitic liver" condemnation rates were associated with year, season, animal class, audit rating, and region. Pneumonic lung condemnation rates were associated with year, season, animal class, region, audit rating, number of cattle processed per year, and number of weeks abattoirs processed cattle.
Unlike previous models based on whole carcass condemnations, commodity price was not associated with partial condemnations in this study. The results identified material-specific predictor variables for condemnation rates. This is important for syndromic surveillance based on abattoir data; these variables should be modeled and controlled for, on a portion-specific basis, during quantitative surveillance analysis.
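The multi-level Poisson approach described above can be illustrated with a minimal sketch of a Poisson log-linear rate model with an exposure offset. All coefficient values below are invented for illustration; this is not the authors' fitted model or their estimates:

```python
import math

def expected_condemnations(n_processed, baseline, effects):
    """Expected condemnation count under a Poisson log-linear rate model.

    log(lambda) = log(n_processed) + baseline + sum(effects)

    The log(n_processed) term is the exposure offset, so `baseline` and
    `effects` act on the condemnation *rate* per animal processed.
    """
    log_lambda = math.log(n_processed) + baseline + sum(effects)
    return math.exp(log_lambda)

# Hypothetical coefficients: a log baseline rate plus season and
# region effects on the log scale (all illustrative).
lam = expected_condemnations(
    n_processed=1000,
    baseline=-6.0,
    effects=[0.3, -0.1],   # e.g. season = spring, region = west
)
# lam is the expected number of condemned portions for this abattoir-year
```

Because the offset enters with a fixed coefficient of 1, doubling the number of animals processed doubles the expected count while leaving the modeled rate unchanged, which is why the abstract treats throughput as an offset-style variable rather than an ordinary covariate.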
BMC Veterinary Research 06/2012; 8(1):88. · 1.86 Impact Factor
ABSTRACT: Practicing veterinarians play an important role in detecting the initial outbreak of disease in animal populations. A pilot study was conducted to determine the feasibility of a veterinary-based surveillance system for the Ontario swine industry. A total of 7 practitioners from 5 clinics agreed to submit information from July 1, 2007 to June 30, 2008. The surveillance program was evaluated in terms of timeliness, compliance, geographic coverage, and data quality. Our study showed that the veterinary-based surveillance system was acceptable to practitioners and produced useful data. The program obtained information from 25% of pig farms in Ontario during this time period. However, better communication with practitioners, more user-friendly recording systems that can be adapted to each clinic's management system, active involvement of the clinics' technical personnel, and the use of financial incentives may help to improve compliance and timeliness.
Canadian journal of veterinary research = Revue canadienne de recherche vétérinaire 10/2010; 74(4):241-51. · 1.19 Impact Factor
ABSTRACT: Ontario provincial abattoirs have the potential to be important sources of syndromic surveillance data for emerging diseases of concern to animal health, public health and food safety. The objectives of this study were to: (1) describe provincially inspected abattoirs processing cattle in Ontario in terms of the number of abattoirs, the number of weeks abattoirs process cattle, geographical distribution, types of whole carcass condemnations reported, and the distance animals are shipped for slaughter; and (2) identify various seasonal, secular, disease and non-disease factors that might bias the results of quantitative methods, such as cluster detection methods, used for food animal syndromic surveillance.
Data were collected from the Ontario Ministry of Agriculture, Food and Rural Affairs and the Ontario Cattlemen's Association regarding whole carcass condemnation rates for cattle animal classes, abattoir compliance ratings, and the monthly sales-yard price for various cattle classes from 2001 to 2007. To analyze the association between condemnation rates and potential explanatory variables including abattoir characteristics, season, year and commodity price, as well as animal class, negative binomial regression models were fit using generalized estimating equations (GEE) to account for autocorrelation among observations from the same abattoir. The fitted models showed that animal class, year, season, price, and audit rating were associated with condemnation rates in Ontario abattoirs. In addition, a subset of data was used to estimate the average distance cattle are shipped to Ontario provincial abattoirs. The median distance from the farm to the abattoir was approximately 82 km, and 75% of cattle were shipped less than 100 km.
The results suggest that secular and seasonal trends, as well as some non-disease factors will need to be corrected for when applying quantitative methods for syndromic surveillance involving these data. This study also demonstrated that animals shipped to Ontario provincial abattoirs come from relatively local farms, which is important when considering the use of spatial surveillance methods for these data.
BMC Veterinary Research 01/2010; 6:42. · 1.86 Impact Factor
ABSTRACT: The North American Animal Disease Spread Model is a stochastic, spatial, state-transition simulation model for the spread of highly contagious diseases of animals. It was developed with broad international support to assist policy development and decision making involving disease incursions. User-established parameters define model behavior in terms of disease progression; disease spread by animal-to-animal contact, contact with contaminated personnel or equipment, and airborne dissemination; and the implementation of control measures such as destruction and vaccination. Resources available to implement disease control strategies, as well as the direct costs associated with these strategies, are taken into consideration. The model records a wide variety of measures of the extent of simulated outbreaks and other characteristics. The graphical interface and output visualization features also make it a useful tool for training and preparedness exercises. This model is now being used to evaluate outbreak scenarios and potential control strategies for several economically important exotic animal diseases in the United States, Canada, and elsewhere. NAADSM is freely available via the Internet at http://www.naadsm.org.
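A stochastic state-transition model of this general kind can be sketched in a few lines: herds move through susceptible, latent, infectious, and removed states, with infectious herds exposing susceptible ones by contact. This is a toy illustration of the modeling style, not NAADSM itself; every parameter value below is invented:

```python
import random

# Herd states: Susceptible -> Latent -> Infectious -> Removed.
STATES = ("S", "L", "I", "R")

def simulate(n_herds=100, seed_infected=1, days=200,
             p_contact=0.002, p_latent_to_inf=0.25, p_removal=0.1, rng=None):
    """Daily-step stochastic state-transition simulation (illustrative)."""
    rng = rng or random.Random(0)
    herds = ["I"] * seed_infected + ["S"] * (n_herds - seed_infected)
    for _ in range(days):
        n_inf = herds.count("I")
        nxt = []
        for state in herds:
            if state == "S" and rng.random() < 1 - (1 - p_contact) ** n_inf:
                nxt.append("L")   # exposed via contact with an infectious herd
            elif state == "L" and rng.random() < p_latent_to_inf:
                nxt.append("I")   # latent period ends
            elif state == "I" and rng.random() < p_removal:
                nxt.append("R")   # destroyed or recovered
            else:
                nxt.append(state)
        herds = nxt
    return herds

final = simulate()
outbreak_size = sum(1 for s in final if s != "S")
```

Running the simulation many times with different random seeds yields the kind of output distributions the abstract mentions, such as the range of simulated outbreak sizes under a given control strategy.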
Preventive Veterinary Medicine 01/2008; 82(3-4):176-97. · 2.39 Impact Factor
ABSTRACT: The epidemiology of influenza in the North American swine population has changed since the emergence of a triple-reassortant H3N2 influenza virus. Although viruses of this reassortant H3N2 lineage had been seen previously elsewhere in North America, the Ontario swine population had likely been free of them until 2005. The objective of this study was to investigate the frequency and distribution of exposure to H1N1 and H3N2 subtypes in the Ontario finisher pig population prior to and after the H3N2 outbreak that occurred in 2005. This included investigating the prevalence and spatial distribution of positive herds, assessing the proportion of random variation at different hierarchical levels, and evaluating selected demographic factors and management procedures as potential risk factors. In total, 919 and 978 sera collected in cross-sectional studies from 46 and 49 finisher herds in 2004 and 2005 were tested by an H1N1 subtype-specific and an H3N2 subtype-specific commercial ELISA. For the H1N1 subtype, the point prevalence of positive herds (>3 reactors) was 19.5% and 30.6% in 2004 and 2005, respectively. For the H3N2 subtype, the point prevalence of positive herds (>3 reactors) was 6.5% and 40.8% in 2004 and 2005, respectively. Sera from 2004 that were positive on H3N2 ELISA did not cross-react with any of the H3N2 variants used as antigen on a sequential HI test. Only herds positive for the H3N2 subtype in 2005 clustered in space (P<0.01). The H1N1 status in 2005 was associated with the H1N1 status in 2004, and with reported distance to the nearest herd. The H3N2 status in 2005 was associated with reported distance to the nearest herd and the type of replacement gilt source. For H3N2, distance seemed to be important even after controlling for type of gilt source. Most variability in seropositivity was between herds, with little variability between pens.
This study confirms that in 2005, the epidemic H3N2 subtype co-circulated with the endemic H1N1 subtype in Ontario finisher herds. We concluded that in Ontario, the endemic H1N1 subtype was likely maintained through circulation within herds and sites with common flow, whereas transmission of the epidemic H3N2 subtype was attributed to local spread, which could include different modes of direct, indirect, and airborne transmission. We emphasize the importance of establishing routine monitoring systems that would allow the use of molecular tools, and of maintaining serum banks as a useful resource for retrospective comparisons.
Preventive Veterinary Medicine 01/2008; 83(1):24-40. · 2.39 Impact Factor
ABSTRACT: This study estimated the health burden and costs associated with gastroenteritis in the City of Hamilton (Ontario, Canada). The number of cases, number of different resource units used, and cost per resource unit were represented by probability distributions and point estimates. These were subsequently integrated in a stochastic model to estimate the overall burden and cost in the population and to depict the uncertainty of the estimates. The estimated mean annual cost per capita was Can$115. The estimated mean annual cost per case was Can$1,089 and was similar to other published figures. Gastroenteritis represented a significant burden in the study population, with costs high enough to justify prevention efforts. These results, currently the most accurate available estimates for a Canadian population, can inform future economic evaluations to determine the most cost-effective measures for reducing the burden and cost of gastroenteritis in the community.
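The stochastic cost-of-illness approach described above can be sketched minimally: uncertain inputs are drawn from probability distributions and propagated to a population total across many iterations. The triangular distribution and its bounds below are illustrative assumptions; only the roughly Can$1,089 mean cost per case comes from the study:

```python
import random

def simulate_annual_cost(n_cases, iterations=10_000, rng=None):
    """Monte Carlo estimate of total annual cost (illustrative sketch)."""
    rng = rng or random.Random(42)
    totals = []
    for _ in range(iterations):
        # Hypothetical uncertainty around the mean cost per case (Can$):
        # triangular(low, high, mode) with mode near the study's estimate.
        cost_per_case = rng.triangular(700, 1500, 1089)
        totals.append(n_cases * cost_per_case)
    return sum(totals) / len(totals)

mean_total = simulate_annual_cost(n_cases=50_000)
```

Keeping the full list of simulated totals (rather than only the mean) is what lets this kind of model "depict the uncertainty of the estimates", for example by reporting 5th and 95th percentiles.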
Journal of food protection 04/2006; 69(3):651-9. · 1.83 Impact Factor
ABSTRACT: To estimate the magnitude and distribution of self-reported, acute gastrointestinal illness in a Canadian-based population, we conducted a retrospective, cross-sectional telephone survey of approximately 3500 randomly selected residents of the city of Hamilton (Ontario, Canada) from February 2001 to February 2002. The observed monthly prevalence was 10% (95% CI 9.94-10.14) and the incidence rate was 1.3 (95% CI 1.1-1.4) episodes per person-year; this is within the range of estimates from other developed countries. The prevalence was higher in females and in those aged < 10 years and 20-24 years. Overall, prevalence peaked in April and October, but a different temporal distribution was observed for those aged < 10 years. Although these data were derived from one community, they demonstrate that the epidemiology of acute gastrointestinal illness in a Canadian-based population is similar to that reported for other developed countries.
Epidemiology and Infection 09/2004; 132(4):607-17. · 2.87 Impact Factor
ABSTRACT: The Reveal (Neogen Corp., Lansing, Mich.) and SafePath (SafePath Laboratories LLC, St. Paul, Minn.) tests were evaluated for their performance as beef fecal and beef carcass Escherichia coli O157:H7 monitoring tests. Agreement between these tests and a reference test system was determined using naturally contaminated bovine feces and beef carcasses. The reference system utilized immunomagnetic separation with plating onto cefixime, tellurite, sorbitol MacConkey agar, followed by colony testing using a serum agglutination test for the O157 antigen. Relative to this reference method, the Reveal test showed a sensitivity of 46% and a specificity of 82% on bovine feces and a specificity of 99% on carcass samples. The SafePath test demonstrated a significantly higher sensitivity of 79% and a similar specificity of 79%. On carcass samples the SafePath test performed similarly to the Reveal test, demonstrating a specificity of 100% relative to the reference system. There were too few E. coli O157-positive carcass samples to precisely estimate the sensitivity of these two methods. Both methods show promise as rapid carcass monitoring tests, but further field testing to estimate sensitivity is needed to complete their evaluation. The proportion of fecal samples positive for E. coli O157:H7 by the reference method ranged from 10.2% to 36% in 10 lots of cattle, with an overall mean of 17.3% (39/225). Quarter carcass sponging of 125 carcasses revealed 1.6% positive for the pathogen (2/125).
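The performance measures reported above are simple functions of a two-by-two table against the reference method. The counts below are invented to reproduce the reported Reveal fecal-test figures (46% sensitivity, 82% specificity); they are not the study's raw data:

```python
def sensitivity(tp, fn):
    """Proportion of reference-positive samples the test detects."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of reference-negative samples the test clears."""
    return tn / (tn + fp)

sens = sensitivity(tp=23, fn=27)   # 23/50  = 0.46
spec = specificity(tn=82, fp=18)   # 82/100 = 0.82
```

The abstract's note that too few positive carcass samples were available is visible in these formulas: sensitivity depends only on reference-positive samples (tp + fn), so a rare outcome leaves its denominator too small for a precise estimate even when the total sample is large.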
Journal of food protection 08/2000; 63(7):860-6. · 1.83 Impact Factor
ABSTRACT: High pressure liquid chromatographic methods were used for measurement of the concentration of vitamin C and β-carotene in broccoli and green pepper. The effects of processing, packaging and storage on the levels of these nutrients in both unprocessed and processed ready-to-use (RTU) vegetables were determined. Systems investigated included: (a) unpacked and pillow packaged broccoli, and (b) unpacked, pillow, partial vacuum, and total vacuum packaged green pepper. There was a statistically significant decrease in vitamin C over a 10 day storage period of unpacked and packaged vegetables including all four packaging systems (P<0.001, overall average decrease of 11%). The overall loss of β-carotene during the 10 day storage period was not statistically significant (P=0.14). Although there was a significant loss in vitamin C during storage, in most cases there was no difference in loss of vitamin C or β-carotene between the processed and unprocessed vegetables, and the packaging systems.
ABSTRACT: A stochastic simulation model was used to assess the benefit of measures implemented in the pre-slaughter period that are aimed at reducing the contamination of beef carcasses with Shiga-like-toxin-producing Escherichia coli O157. The scenario studied was based on an abattoir processing approximately 1000 head of lot-fed cattle per day. Input assumptions were described using probability distributions to reflect uncertainty in their true values. Control measures that were assessed were based on either a reduction in herd prevalence of infection, reduction in opportunity for cross-contamination in the processing plant by re-ordering of the slaughter queue, reduction of concentration of E. coli O157 in fresh faeces, or a reduction in the amount of faeces, mud and bedding ('tag') transferred from the hide to the carcass. Some control measures evaluated were hypothetical in nature and were included to assist with the planning of research priorities. Simulations suggested that the greatest potential impact is associated with vaccination and with an agent that reduces shedding of E. coli O157 in faeces. Knowledge of herd-test results obtained by testing a sample of animals from the herd provides only a minor advantage in control programmes, although application of a rapid test to all animals in all lots might be of some benefit. Under most scenarios, there is ample opportunity for cross-contamination to occur within the slaughter plant as a result of early entry of cattle contaminated with E. coli O157. An industry-wide reduction in the amount of tag attached to hides and addition of a source of cattle having a prolonged average fasting time were not predicted to have a large impact on mean amount of carcass contamination with E. coli O157.
Preventive Veterinary Medicine 07/1999; 41(1):55-74. · 2.39 Impact Factor
ABSTRACT: A Monte Carlo simulation model was constructed for assessing the quantity of microbial hazards deposited on cattle carcasses under different pre-slaughter management regimens. The model permits comparison of industry-wide and abattoir-based mitigation strategies and is suitable for studying pathogens such as Escherichia coli O157:H7 and Salmonella spp. Simulations are based on a hierarchical model structure that mimics important aspects of the cattle population prior to slaughter. Stochastic inputs were included so that uncertainty about important input assumptions (such as prevalence of a human pathogen in the live cattle-population) would be reflected in model output. Control options were built into the model to assess the benefit of having prior knowledge of animal or herd-of-origin pathogen status (obtained from the use of a diagnostic test). Similarly, a facility was included for assessing the benefit of re-ordering the slaughter sequence based on the extent of external faecal contamination. Model outputs were designed to evaluate the performance of an abattoir in a 1-day period and included outcomes such as the proportion of carcasses contaminated with a pathogen, the daily mean and selected percentiles of pathogen counts per carcass, and the position of the first infected animal in the slaughter run. A measure of the time rate of introduction of pathogen into the abattoir was provided by assessing the median, 5th percentile, and 95th percentile cumulative pathogen counts at 10 equidistant points within the slaughter run. Outputs can be graphically displayed as frequency distributions, probability densities, cumulative distributions or x-y plots. The model shows promise as an inexpensive method for evaluating pathogen control strategies such as those forming part of a Hazard Analysis and Critical Control Point (HACCP) system.
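One of the model outputs described above, the position of the first infected animal in the slaughter run, can be sketched directly: lots enter in slaughter order, and each animal is infected with its lot's within-lot prevalence. The lot sizes and prevalences below are illustrative inputs, not values from the model:

```python
import random

def first_infected_position(lots, rng=None):
    """Return the 1-based position of the first infected animal.

    lots: list of (lot_size, within_lot_prevalence) in slaughter order.
    Returns None if no infected animal enters the plant that day.
    """
    rng = rng or random.Random(7)
    position = 0
    for size, prevalence in lots:
        for _ in range(size):
            position += 1
            if rng.random() < prevalence:
                return position
    return None

# Hypothetical slaughter queue: a clean lot first, then two with
# moderate within-lot prevalences.
pos = first_infected_position([(120, 0.0), (80, 0.17), (250, 0.05)])
```

Re-running this across many simulated days gives the distribution of first-entry positions, which is what makes re-ordering the slaughter queue (putting likely-contaminated lots last) an evaluable control option.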
Preventive Veterinary Medicine 07/1999; 41(1):37-54. · 2.39 Impact Factor
ABSTRACT: A study was conducted to provide a quantitative description of the amount of tag (mud, soil, and bedding) adhered to the hides of feedlot beef cattle and to appraise the statistical reliability of a subjective rating system for assessing this trait. Initially, a single rater obtained baseline data by assessing 2,417 cattle for 1 month at an Ontario beef processing plant. Analysis revealed that there was a strong tendency for animals within sale-lots to have a similar total tag score (intralot correlation = 0.42). Baseline data were summarized by fitting a linear model describing an individual's total tag score as the sum of their lot mean tag score (LMTS) plus an amount representing normal variation within the lot. LMTSs predicted by the linear model were adequately described by a beta distribution with parameters nu = 3.12 and omega = 5.82 scaled to fit on the 0-to-9 interval. Five raters, trained in use of the tag scoring system, made 1,334 tag score observations in a commercial abattoir, allowing reliability to be assessed at the individual level and at the lot level. High values for reliability were obtained for individual total tag score (0.84) and lot total tag score (0.83); these values suggest that the tag scoring system could be used in the marketing and slaughter of Ontario beef cattle to improve the cleanliness of animals presented for slaughter in an effort to control the entry of microbial contamination into abattoirs. Implications for the use of the tag scoring system in research are discussed.
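The fitted distribution of lot mean tag scores reported above, a beta(3.12, 5.82) scaled to the 0-to-9 scoring interval, is straightforward to simulate, which is how such a fit feeds into downstream contamination models. This sketch only draws from the published distribution; everything else (sample size, seeding) is an illustrative choice:

```python
import random

def draw_lmts(n, nu=3.12, omega=5.82, scale=9.0, rng=None):
    """Draw simulated lot mean tag scores from the fitted scaled beta."""
    rng = rng or random.Random(0)
    return [scale * rng.betavariate(nu, omega) for _ in range(n)]

lot_means = draw_lmts(10_000)
mean_score = sum(lot_means) / len(lot_means)
# Theoretical mean: 9 * 3.12 / (3.12 + 5.82), roughly 3.14 on the 0-9 scale
```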
Journal of food protection 06/1999; 62(5):520-5. · 1.83 Impact Factor
ABSTRACT: The performances of five automated microbial identification systems, relative to that of a reference identification system, for their ability to accurately and repeatedly identify six common food-borne pathogens were assessed. The systems assessed were the MicroLog system (Biolog Inc., Hayward, Calif.), the Microbial Identification System (MIS; MIDI Inc., Newark, Del.), the VITEK system (bioMérieux Vitek, Hazelwood, Mo.), the MicroScan WalkAway 40 system (Dade-MicroScan International, West Sacramento, Calif.), and the Replianalyzer system (Oxoid Inc., Nepean, Ontario, Canada). The sensitivities and specificities of these systems for the identification of food-borne isolates of Bacillus cereus, Campylobacter jejuni, Listeria monocytogenes, Staphylococcus aureus, Salmonella spp., and verotoxigenic Escherichia coli were determined with 40 reference positive isolates and 40 reference negative isolates for each pathogen. The sensitivities of these systems for the identification of these pathogens ranged from 42.5 to 100%, and the specificities ranged from 32.5 to 100%. Some of the systems had difficulty correctly identifying the reference isolates when the results were compared to those from the reference identification tests. The sensitivity of MIS for the identification of S. aureus, B. cereus, E. coli, and C. jejuni, for example, ranged from 47.5 to 72.5%. The sensitivity of the MicroLog system for the identification of E. coli was 72.5%, and the sensitivity of the VITEK system for the identification of B. cereus was 42.5%. The specificities of four of the five systems for the identification of all of the species tested with the available databases were greater than or equal to 97.5%; the exception was MIS for the identification of C. jejuni, which displayed a specificity of 32.5% when it was tested with reference negative isolates including Campylobacter coli and other Campylobacter species. All systems had >80% sensitivities for the identification of Salmonella species and Listeria species at the genus level. The repeatability of these systems for the identification of test isolates ranged from 30 to 100%. Not all systems included all six pathogens in their databases; thus, some species could not be tested with all systems. The choice of automated microbial identification system for the identification of a food-borne pathogen would depend on the availability of identification libraries within the systems and the performance of the systems for the identification of the pathogen.
Journal of Clinical Microbiology 04/1999; 37(4):944-9. · 4.07 Impact Factor
ABSTRACT: The LacTek test, marketed for antimicrobial residue detection in milk, was validated for the detection of antimicrobial residues in tissues. A previous study found that the LacTek test could confidently identify tissue samples spiked with antimicrobial residues. However, the test could not reliably distinguish violative from nonviolative spiked samples relative to Canadian maximum residue limits (MRLs). The objectives of this study were to assess and compare the performance of the LacTek tests for beta-lactams, tetracyclines, gentamicin, and sulfamethazine on samples containing naturally incurred residues by running the test in parallel with the standard microbial inhibition test (MIT) presently used for the routine testing of tissues at our facility, and to assess the agreement with high pressure liquid chromatographic (HPLC) determinative methods. Parallel testing with the official MIT found that the LacTek tests could be confidently used for testing tissue samples containing incurred residues. Among 1,008 MIT-positive samples, the LacTek test found that 90% contained beta-lactams and/or tetracyclines. A further 7.3% of violative residues could not be identified to an antimicrobial class. In addition, 9% of samples testing negative on the MIT were found to contain an antimicrobial residue by the LacTek tests. Comparative testing with HPLC methods found that there was very good agreement between the two tests and that most violations were due to penicillin G and oxytetracycline. Although the LacTek test cannot be used to distinguish violative from nonviolative residue levels, it does offer several advantages over the present MIT. These include speed, ease of use, the ability to identify residues to a specific class, and an improved sensitivity at the MRL level for the most commonly found antimicrobials in tissue.
Journal of food protection 09/1998; 61(8):1018-22. · 1.83 Impact Factor
ABSTRACT: This paper presents a historical review of antimicrobial use in food animals, the causes of residues in meat and milk, the types of residues found, their regulation in Canada, tests used for their detection, and test performance parameters, with an emphasis on immunoassay techniques. The development of residue detection methods began shortly after the introduction of antimicrobials to food animal production in the late 1940s. From initial technical concerns expressed by the dairy industry to the present public health and international trade implications, there has been an ongoing need for reliable, sensitive, and economical methods for the detection of antimicrobial residues in food animal products such as milk and meat. Initially there were microbial growth inhibition tests, followed by more sensitive and specific methods based on receptor binding, immunochemical, and chromatographic principles. An understanding of basic test performance parameters and their implications is essential when choosing an analytical strategy for residue testing. While each test format has its own attributes, no single test will meet all analytical needs. Therefore, the use of a tiered or integrated system employing assays designated for screening and confirmation is necessary to ensure that foods containing violative residues are not introduced into the food chain.
Journal of food protection 07/1998; 61(6):742-56. · 1.83 Impact Factor
ABSTRACT: Raw (unpasteurized) milk can be a source of food-borne pathogens, and its consumption results in sporadic disease outbreaks. Pasteurization is designed to destroy all bacterial pathogens common to raw milk, excluding spore-forming bacteria and possibly Mycobacterium paratuberculosis, but some people continue to drink raw milk, believing it to be safe. Current methods for assessing the bacteriological quality of raw milk, such as aerobic plate counts, are not usually designed to detect specific pathogens. The objective of this study was to estimate the proportion of pick-ups (loads of raw milk from a single farm bulk tank) from Ontario farm bulk tanks that contained Listeria monocytogenes, Salmonella spp., Campylobacter spp., and/or verotoxigenic Escherichia coli (VTEC). Samples from 1,720 pick-ups of raw milk were tested for the presence of these pathogens, and 47 L. monocytogenes, three Salmonella spp., eight Campylobacter spp., and 15 VTEC isolates were detected, representing 2.73, 0.17, 0.47, and 0.87% of milk samples, respectively. Estimates of the proportion of theoretical tanker truck loads of pooled raw milk contaminated with pathogens ranged from a low of 0.51% of tankers containing raw milk from 3 bulk tanks being contaminated with Salmonella spp. to a high of 34.41% of tankers containing raw milk from 10 bulk tanks being contaminated with at least one of the pathogens. Associations between the presence of pathogens and raw milk sample characteristics were investigated. The mean somatic cell count was higher among VTEC- or L. monocytogenes-positive samples, and the mean aerobic plate count was found to be higher among L. monocytogenes-positive samples. These results confirm the presence of bacterial food-borne pathogens in raw milk and emphasize the importance of continued diligence in the application of hygiene programs within dairies and the separation of raw milk from pasteurized milk and milk products.
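The tanker-load estimates above follow from simple pooling arithmetic: if each bulk-tank pick-up is contaminated independently with probability p, a tanker pooling n pick-ups is contaminated with probability 1 - (1 - p)^n. The independence assumption is ours for this sketch, but plugging in the observed Salmonella prevalence (0.17% of pick-ups) with n = 3 reproduces the reported 0.51%:

```python
def pooled_prob(p, n):
    """Probability a pooled load of n independent pick-ups is contaminated."""
    return 1 - (1 - p) ** n

# Observed per-pick-up Salmonella prevalence, 3-tank tanker.
salmonella_3_tanks = pooled_prob(0.0017, 3)   # roughly 0.0051, i.e. 0.51%
```

The same formula explains why pooling sharply amplifies contamination risk: the 10-tank any-pathogen figure of 34.41% arises because the per-pick-up probabilities for the four pathogens combine and then compound across ten tanks.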
Journal of food protection 10/1997; 60(11):1341-1346. · 1.83 Impact Factor
ABSTRACT: The authors provide an overview of non-biological contaminants in foods from animals. These contaminants comprise chemical and physical hazards which may be introduced during animal production, slaughter and processing or packaging. Emphasis in this paper is placed on those residues which are of most interest to Veterinary Services and for which Veterinary Services have responsibility, namely: residues of veterinary drugs, industrial chemicals, heavy metals and pesticides which may be introduced during animal production. The most contentious residues which occur in meat, milk and eggs are antibacterial drugs, hormonal growth promoters and certain pesticides, heavy metals and industrial chemicals. While rare incidents of human disease have been attributed to hazardous levels of these contaminants in milk and meat, residues of chemical contaminants in foods of animal origin are, in general, rarely detected at more than trace levels and consequently are not of major public health concern. Nevertheless, non-biological contaminants continue to be very important with respect to international trade and consumer confidence, and efforts to reduce their occurrence in foods are warranted. Furthermore, continued monitoring and periodic reassessment of risks posed by these contaminants is needed to detect or anticipate new problems so that appropriate action can be taken in the interests of public safety.
Revue scientifique et technique (International Office of Epizootics) 08/1997; 16(2):684-93. · 0.69 Impact Factor
ABSTRACT: Differentiation of strains within bacterial species, based on gas chromatographic analysis of whole-cell fatty acid profiles, was assessed with 115 strains of verotoxigenic Escherichia coli and 315 strains of Salmonella enteritidis. Fatty acid-based subgroups within each of the two species were generated. Variability of fatty acid profiles observed in repeat preparations from the same strain approached that observed between subgroups, limiting the usefulness of using fatty acid profiles to subgroup verotoxigenic E. coli and S. enteritidis strains.
Applied and Environmental Microbiology 03/1997; 63(2):757-60. · 3.68 Impact Factor