Similar Articles
20 similar articles found (search time: 46 ms).
1.
Root cause analysis can be used in foodborne illness outbreak investigations to determine the underlying causes of an outbreak and to help identify actions that could be taken to prevent future outbreaks. We developed a new tool, the Quantitative Risk Assessment-Epidemic Curve Prediction Model (QRA-EC), to assist with these goals and applied it to a case study to investigate and illustrate the utility of leveraging quantitative risk assessment to provide unique insights for foodborne illness outbreak root cause analysis. We used a 2019 Salmonella outbreak linked to melons as a case study to demonstrate the utility of this model (Centers for Disease Control and Prevention [CDC], 2019). The model was used to evaluate the impact of various root cause hypotheses (representing different contamination sources and food safety system failures in the melon supply chain) on the predicted number and timeline of illnesses. The predicted number of illnesses varied by contamination source and was strongly impacted by the prevalence and level of Salmonella contamination on the surface/inside of whole melons and inside contamination niches on equipment surfaces. The timeline of illnesses was most strongly impacted by equipment sanitation efficacy for contamination niches. Evaluations of a wide range of scenarios representing various potential root causes enabled us to identify which hypotheses were likely to result in an outbreak of similar size and illness timeline to the 2019 Salmonella melon outbreak. The QRA-EC framework can be adapted to accommodate any food–pathogen pairs to provide insights for foodborne outbreak investigations.
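To make the mechanics concrete, the sketch below propagates a hypothesized contamination prevalence and level through an exponential dose-response model and a simple onset-timing assumption to produce a predicted illness count and epidemic curve. All parameter values (prevalence, contamination level, dose-response parameter, shelf life, incubation period) are illustrative placeholders, not values from the QRA-EC model or the 2019 outbreak.

```python
import math
import random
from collections import Counter

random.seed(1)

# Illustrative placeholder inputs, not the QRA-EC values.
N_MELONS = 200_000        # melons shipped during the outbreak window
PREVALENCE = 0.002        # fraction of melons carrying Salmonella
MEAN_LOG_CFU = 2.0        # mean log10 CFU per contaminated melon
SD_LOG_CFU = 0.8
R_DOSE_RESPONSE = 2.5e-3  # exponential dose-response parameter (assumed)
SHELF_LIFE_DAYS = 14      # melon consumed within this window of shipment
INCUBATION_DAYS = 2       # onset lag after consumption

def p_illness(dose_cfu):
    """Exponential (single-hit) dose-response."""
    return 1.0 - math.exp(-R_DOSE_RESPONSE * dose_cfu)

onsets = Counter()
for _ in range(N_MELONS):
    if random.random() > PREVALENCE:
        continue                                  # this melon is clean
    dose = 10 ** random.gauss(MEAN_LOG_CFU, SD_LOG_CFU)
    if random.random() < p_illness(dose):
        consume_day = random.randint(0, SHELF_LIFE_DAYS - 1)
        onsets[consume_day + INCUBATION_DAYS] += 1

print(f"predicted illnesses: {sum(onsets.values())}")
for day in sorted(onsets):                        # crude epidemic curve
    print(f"day {day:2d}: {onsets[day]:3d} {'*' * onsets[day]}")
```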

2.
《Risk analysis》2018,38(8):1718-1737
We developed a probabilistic mathematical model for the postharvest processing of leafy greens focusing on Escherichia coli O157:H7 contamination of fresh‐cut romaine lettuce as the case study. Our model can (i) support the investigation of cross‐contamination scenarios, and (ii) evaluate and compare different risk mitigation options. We used an agent‐based modeling framework to predict the pathogen prevalence and levels in bags of fresh‐cut lettuce and quantify the spread of E. coli O157:H7 from contaminated lettuce to surface areas of processing equipment. Using an unbalanced factorial design, we were able to propagate combinations of random values assigned to model inputs through different processing steps and rank statistically significant inputs with respect to their impacts on selected model outputs. Results indicated that whether contamination originated on incoming lettuce heads or on the surface areas of processing equipment, pathogen prevalence among bags of fresh‐cut lettuce and batches was most significantly impacted by the level of free chlorine in the flume tank and frequency of replacing the wash water inside the tank. Pathogen levels in bags of fresh‐cut lettuce were most significantly influenced by the initial levels of contamination on incoming lettuce heads or surface areas of processing equipment. The influence of surface contamination on pathogen prevalence or levels in fresh‐cut bags depended on the location of that surface relative to the flume tank. This study demonstrates that developing a flexible yet mathematically rigorous modeling tool, a “virtual laboratory,” can provide valuable insights into the effectiveness of individual and combined risk mitigation options.
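A drastically simplified, single-step stand-in for the flume-tank logic described above might look like the sketch below, in which free chlorine drives both inactivation on contaminated heads and the chance of water-mediated cross-contamination of other bags in the batch. Every relationship and parameter value here is invented for illustration and is not taken from the published model.

```python
import math
import random

random.seed(42)

def simulate_batch(n_bags=1000, frac_contaminated_heads=0.01,
                   initial_log_cfu=3.0, free_chlorine_mgL=10.0,
                   water_change_interval=200):
    """Toy flume-tank model: chlorine reduces pathogen load and limits
    water-mediated cross-contamination; stale wash water spreads it."""
    # Assumed relationships (illustrative): higher chlorine -> larger log
    # reduction and lower transfer probability via the wash water.
    log_reduction = min(3.0, 0.3 * free_chlorine_mgL)
    transfer_prob = 0.05 * math.exp(-0.4 * free_chlorine_mgL)

    contaminated_bags = 0
    water_contaminated = False
    for i in range(n_bags):
        if i % water_change_interval == 0:
            water_contaminated = False          # fresh wash water
        if random.random() < frac_contaminated_heads:
            level = initial_log_cfu - log_reduction
            if level > 0:
                contaminated_bags += 1
            water_contaminated = True           # sheds into wash water
        elif water_contaminated and random.random() < transfer_prob:
            contaminated_bags += 1              # cross-contaminated bag
    return contaminated_bags / n_bags

for cl in (0.5, 2.0, 10.0):
    print(f"free chlorine {cl:4.1f} mg/L -> "
          f"prevalence in bags ~ {simulate_batch(free_chlorine_mgL=cl):.3f}")
```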

3.
We used an agent‐based modeling (ABM) framework and developed a mathematical model to explain the complex dynamics of microbial persistence and spread within a food facility and to aid risk managers in identifying effective mitigation options. The model explicitly considered personal hygiene practices by food handlers as well as their activities and simulated a spatially explicit dynamic system representing complex interaction patterns among food handlers, facility environment, and foods. To demonstrate the utility of the model in a decision‐making context, we created a hypothetical case study and used it to compare different risk mitigation strategies for reducing contamination and spread of Listeria monocytogenes in a food facility. Model results indicated that areas with no direct contact with foods (e.g., loading dock and restroom) can serve as contamination niches and recontaminate areas that have direct contact with food products. Furthermore, food handlers’ behaviors, including, for example, hygiene and sanitation practices, can impact the persistence of microbial contamination in the facility environment and the spread of contamination to prepared foods. Using this case study, we also demonstrated benefits of an ABM framework for addressing food safety in a complex system in which emergent system‐level responses are predicted using a bottom‐up approach that observes individual agents (e.g., food handlers) and their behaviors. Our model can be applied to a wide variety of pathogens, food commodities, and activity patterns to evaluate efficacy of food‐safety management practices and quantify contamination reductions associated with proposed mitigation strategies in food facilities.
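The following toy agent-based sketch captures the qualitative behavior the abstract describes (non-food-contact zones acting as contamination niches, handler hygiene governing spread to product). The zones, probabilities, and schedule are assumptions made purely for illustration.

```python
import random

random.seed(7)

ZONES = ["loading_dock", "restroom", "prep_area", "packaging"]
FOOD_CONTACT = {"prep_area", "packaging"}

class Handler:
    def __init__(self, hygiene_compliance):
        self.hygiene_compliance = hygiene_compliance  # prob. of effective handwashing
        self.contaminated = False

def run_shift(n_handlers=5, hours=8, hygiene_compliance=0.8,
              sanitation_efficacy=0.9, transfer_prob=0.3):
    """Toy Listeria-spread model: handlers pick up contamination in any zone
    and can deposit it in food-contact zones; daily sanitation is imperfect."""
    zone_contaminated = {z: z == "loading_dock" for z in ZONES}  # initial niche
    handlers = [Handler(hygiene_compliance) for _ in range(n_handlers)]
    contaminated_products = 0

    for _hour in range(hours):
        for h in handlers:
            zone = random.choice(ZONES)
            # pick up contamination from the environment
            if zone_contaminated[zone] and random.random() < transfer_prob:
                h.contaminated = True
            # handwashing may remove it
            if random.random() < h.hygiene_compliance:
                h.contaminated = False
            # deposit into the environment / onto product
            if h.contaminated and random.random() < transfer_prob:
                zone_contaminated[zone] = True
                if zone in FOOD_CONTACT:
                    contaminated_products += 1

    # end-of-day sanitation may miss niches
    for z in ZONES:
        if zone_contaminated[z] and random.random() < sanitation_efficacy:
            zone_contaminated[z] = False
    return contaminated_products, zone_contaminated

for compliance in (0.5, 0.9):
    products, env = run_shift(hygiene_compliance=compliance)
    print(f"hygiene compliance {compliance}: {products} contaminated product "
          f"contacts, residual niches: {[z for z, c in env.items() if c]}")
```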

4.
We analyze the risk of contracting illness due to the consumption in the United States of hamburgers contaminated with enterohemorrhagic Escherichia coli (EHEC) of serogroup O157 produced from manufacturing beef imported from Australia. We have used a novel approach for estimating risk by using the prevalence and concentration estimates of E. coli O157 in lots of beef that were withdrawn from the export chain following detection of the pathogen. For the purpose of the present assessment, an assumption was that no product is removed from the supply chain following testing. This, together with a number of additional conservative assumptions, leads to an overestimation of E. coli O157‐associated illness attributable to the consumption of ground beef patties manufactured only from Australian beef. We predict 49.6 illnesses (95% interval: 0.0–148.6) from the 2.46 billion hamburgers made from 155,000 t of Australian manufacturing beef exported to the United States in 2012. All these illnesses were due to undercooking in the home and less than one illness is predicted from consumption of hamburgers cooked to a temperature of 68 °C in quick‐service restaurants.

5.
Shiga‐toxin producing Escherichia coli (STEC) strains may cause human infections ranging from simple diarrhea to Haemolytic Uremic Syndrome (HUS). The five main pathogenic serotypes of STEC (MPS‐STEC) identified thus far in Europe are O157:H7, O26:H11, O103:H2, O111:H8, and O145:H28. Because STEC strains can survive or grow during cheese making, particularly in soft cheeses, a stochastic quantitative microbial risk assessment model was developed to assess the risk of HUS associated with the five MPS‐STEC in raw milk soft cheeses. A baseline scenario represents a theoretical worst‐case scenario where no intervention was considered throughout the farm‐to‐fork continuum. The risk level assessed with this baseline scenario is the risk‐based level. The impact of seven preharvest scenarios (vaccines, probiotic, milk farm sorting) on the risk‐based level was expressed in terms of risk reduction. The impact of the preharvest interventions ranges from 76% to 98% risk reduction, with the highest values predicted for scenarios combining a decrease in the number of cows shedding STEC and in the STEC concentration in feces. The impact of postharvest interventions on the risk‐based level was also tested by applying five microbiological criteria (MC) at the end of ripening. The five MCs differ in terms of sample size, the number of samples that may yield a value larger than the microbiological limit, and the analysis methods. The risk reduction predicted varies from 25% to 96% by applying MCs without preharvest interventions and from 1% to 96% with combinations of pre‐ and postharvest interventions.

6.
Dose Response for Infection by Escherichia coli O157:H7 from Outbreak Data
In 1996, an outbreak of E. coli O157:H7-associated illness occurred in an elementary school in Japan. This outbreak has been studied in unusual detail, making this an important case for quantitative risk assessment. The availability of stored samples of the contaminated food allowed reliable estimation of exposure to the pathogens. Collection of fecal samples allowed assessment of the numbers infected, including asymptomatic cases. Comparison to other published dose-response studies for E. coli O157:H7 shows that the strain that caused the outbreak studied here must have been considerably more infectious. We use this well-documented incident as an example to demonstrate how such information on the response to a single dose can be used for dose-response assessment. In particular, we demonstrate how the high infectivity limits the uncertainty in the low-dose region.
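The core calculation for a single-dose observation can be shown with an exponential (single-hit) model; the dose and attack rate below are assumed for illustration and are not the values measured in the Japanese school outbreak.

```python
import math

# Illustrative numbers, not those from the outbreak: assume the stored food
# samples indicated an average ingested dose of ~30 CFU and that 50% of
# exposed children became infected.
dose = 30.0          # CFU per exposed person (assumed)
attack_rate = 0.50   # fraction infected (assumed)

# Exponential (single-hit) model: P(infection) = 1 - exp(-r * dose).
# Solving for r from a single dose/response observation:
r = -math.log(1.0 - attack_rate) / dose
print(f"fitted single-hit parameter r = {r:.4f} per organism")

# Low-dose extrapolation: with a high per-organism infectivity, the
# low-dose risk is essentially r * dose, which tightly bounds the curve.
for low_dose in (0.1, 1.0, 10.0):
    p = 1.0 - math.exp(-r * low_dose)
    print(f"dose {low_dose:5.1f} CFU -> P(infection) = {p:.4f} "
          f"(linear approx {r * low_dose:.4f})")
```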

7.
《Risk analysis》2018,38(3):429-441
The 2014 Ebola virus disease (EVD) outbreak affected several countries worldwide, including six West African countries. It was the largest Ebola epidemic in history and the first to affect multiple countries simultaneously. Significant national and international delays in response to the epidemic resulted in 28,652 cases and 11,325 deaths. The aim of this study was to develop a risk analysis framework to prioritize rapid response for situations of high risk. Based on findings from the literature, sociodemographic features of the affected countries, and documented epidemic data, a risk scoring framework using 18 criteria was developed. The framework includes measures of socioeconomics, health systems, geographical factors, cultural beliefs, and traditional practices. The three worst affected West African countries (Guinea, Sierra Leone, and Liberia) had the highest risk scores. The scores were much lower in developed countries that experienced Ebola compared to West African countries. A more complex risk analysis framework using 18 measures was compared with a simpler one with 10 measures, and both predicted risk equally well. A simple risk scoring system can incorporate measures of hazard and impact that may otherwise be neglected in prioritizing outbreak response. This framework can be used by public health personnel as a tool to prioritize outbreak investigation and flag outbreaks with potentially catastrophic outcomes for urgent response. Such a tool could mitigate costly delays in epidemic response.

8.
In this study, we propose a time-dependent susceptible-exposed-infected-recovered (SEIR) model for the analysis of the SARS-CoV-2 epidemic outbreak in three different countries, the United States, Italy, and Iceland, using public data on the case numbers of the epidemic wave. Since several types and grades of actions were adopted by the governments, including travel restrictions, social distancing, or limitation of movement, we investigate how these measures affect the epidemic curve of the infectious population. The parameters of interest for the SEIR model were estimated employing a composite likelihood approach. Moreover, standard errors have been corrected for temporal dependence. The adoption of restrictive measures results in flattened epidemic curves, and the predicted future evolution indicates a decrease in the number of cases.
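The parameter-estimation machinery (composite likelihood with temporal-dependence-corrected standard errors) is beyond a short sketch, but the effect of restrictive measures on the epidemic curve can be illustrated with a time-dependent SEIR model in which the transmission rate drops after an intervention date; all rates and population sizes below are illustrative assumptions.

```python
def seir(days, N, beta_fn, sigma=1/5.2, gamma=1/10, E0=20, I0=10, steps_per_day=10):
    """Forward-Euler integration of an SEIR model with a time-varying
    transmission rate beta_fn(day), e.g. reduced after interventions."""
    S, E, I, R = N - E0 - I0, E0, I0, 0.0
    dt = 1.0 / steps_per_day
    infectious_by_day = []
    for day in range(days):
        beta = beta_fn(day)
        for _ in range(steps_per_day):
            new_exposed = beta * S * I / N * dt
            new_infectious = sigma * E * dt
            new_recovered = gamma * I * dt
            S -= new_exposed
            E += new_exposed - new_infectious
            I += new_infectious - new_recovered
            R += new_recovered
        infectious_by_day.append(I)
    return infectious_by_day

N = 1_000_000
no_action = seir(120, N, beta_fn=lambda day: 0.30)
restricted = seir(120, N, beta_fn=lambda day: 0.30 if day < 30 else 0.08)

print("day   infectious (no action)   infectious (restrictions from day 30)")
for day in range(0, 120, 20):
    print(f"{day:3d}   {no_action[day]:22,.0f}   {restricted[day]:20,.0f}")
```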

9.
In this work, we study the environmental and operational factors that influence airborne transmission of nosocomial infections. We link a deterministic zonal ventilation model for the airborne distribution of infectious material in a hospital ward, with a Markovian multicompartment SIS model for the infection of individuals within this ward, in order to conduct a parametric study on ventilation rates and their effect on the epidemic dynamics. Our stochastic model includes arrival and discharge of patients, as well as the detection of the outbreak by screening events or due to symptoms being shown by infective patients. For each ventilation setting, we measure the infectious potential of a nosocomial outbreak in the hospital ward by means of a summary statistic: the number of infections that occur within the hospital ward until the end or declaration of the outbreak. We analytically compute the distribution of this summary statistic, and carry out local and global sensitivity analysis in order to identify the particular characteristics of each ventilation regime with the largest impact on the epidemic spread. Our results show that ward ventilation can have a significant impact on the infection spread, especially under slow detection scenarios or in overoccupied wards, and that decreasing the infection risk for the whole hospital ward might increase the risk in specific areas of the health‐care facility. Moreover, the location of the initial infective individual and the protocol in place for outbreak declaration both form an interplay with ventilation of the ward.

10.
The Bogotá River receives untreated wastewater from the city of Bogotá and many other towns. Downstream from Bogotá, water from the river is used for irrigation of crops. Concentrations of indicator organisms in the river are high, which is consistent with fecal contamination. To investigate the probability of illness due to exposure to enteric pathogens from the river, specifically Salmonella, we took water samples from the Bogotá River at six sampling locations in an area where untreated water from the river is used for irrigation of lettuce, broccoli, and cabbage. Salmonella concentrations were quantified by direct isolation and qPCR. Concentrations differed, depending on the quantification technique used, ranging between 10^7.7 and 10^9.9 copies of the invA gene per L and 10^5.3 and 10^8.4 CFU/L, for qPCR and direct isolation, respectively. A quantitative microbial risk assessment model that estimates the daily risk of illness with Salmonella resulting from consuming raw unwashed vegetables irrigated with water from the Bogotá River was constructed using the Salmonella concentration data. The daily probability of illness from eating raw unwashed vegetables ranged between 0.62 and 0.85, 0.64 and 0.86, and 0.64 and 0.85 based on concentrations estimated by qPCR (0.47–0.85, 0.47–0.86, and 0.41–0.85 based on concentrations estimated by direct isolation) for lettuce, cabbage, and broccoli, respectively, all of which are above the commonly propounded benchmark of 10^-4 per year. Results obtained in this study highlight the necessity for appropriate wastewater treatment in the region, and emphasize the importance of postharvest practices, such as washing, disinfecting, and cooking.
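A minimal version of the daily-to-annual risk calculation such a QMRA rests on is sketched below, using an approximate beta-Poisson dose-response of the kind often cited for nontyphoidal Salmonella; the water-retention and consumption assumptions, and the dose-response parameters, are placeholders rather than the study's inputs.

```python
# Illustrative inputs; the irrigation-water concentration is in the range
# reported in the abstract, everything else is an assumption for this sketch.
conc_cfu_per_L = 10 ** 6.0        # Salmonella in irrigation water (CFU/L)
water_retained_L_per_g = 1e-4     # ~0.1 mL of water retained per gram of produce
serving_g = 30                    # raw unwashed vegetable eaten per day

# Approximate beta-Poisson dose-response (parameters of the kind often cited
# for nontyphoidal Salmonella; treat them as placeholders here).
ALPHA, N50 = 0.31, 23_600

def p_ill(dose):
    return 1.0 - (1.0 + dose * (2 ** (1 / ALPHA) - 1) / N50) ** (-ALPHA)

dose = conc_cfu_per_L * water_retained_L_per_g * serving_g
p_daily = p_ill(dose)
p_annual = 1.0 - (1.0 - p_daily) ** 365   # risk over a year of daily exposure

print(f"dose per serving      : {dose:,.0f} CFU")
print(f"daily risk of illness : {p_daily:.2f}")
print(f"annual risk of illness: {p_annual:.4f} "
      f"({'exceeds' if p_annual > 1e-4 else 'meets'} the 1e-4 per year benchmark)")
```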

11.
《Risk analysis》2018,38(2):392-409
The relative contributions of exposure pathways associated with cattle‐manure‐borne Escherichia coli O157:H7 on public health have yet to be fully characterized. A stochastic, quantitative microbial risk assessment (QMRA) model was developed to describe a hypothetical cattle farm in order to compare the relative importance of five routes of exposure, including aquatic recreation downstream of the farm, consumption of contaminated ground beef processed with limited interventions, consumption of leafy greens, direct animal contact, and the recreational use of a cattle pasture. To accommodate diverse environmental and hydrological pathways, existing QMRAs were integrated with novel and simplistic climate and field‐level submodels. The model indicated that direct animal contact presents the greatest risk of illness per exposure event during the high pathogen shedding period. However, when accounting for the frequency of exposure, using a high‐risk exposure‐receptor profile, consumption of ground beef was associated with the greatest risk of illness. Additionally, the model was used to evaluate the efficacy of hypothetical interventions affecting one or more exposure routes; concurrent evaluation of multiple routes allowed for the assessment of the combined effect of preharvest interventions across exposure pathways—which may have been previously underestimated—as well as the assessment of the effect of additional downstream interventions. This analysis represents a step towards a full evaluation of the risks associated with multiple exposure pathways; future incorporation of variability associated with environmental parameters and human behaviors would allow for a comprehensive assessment of the relative contribution of exposure pathways at the population level.

12.
Regulatory agencies often perform microbial risk assessments to evaluate the change in the number of human illnesses as the result of a new policy that reduces the level of contamination in the food supply. These agencies generally have regulatory authority over the production and retail sectors of the farm‐to‐table continuum. Any predicted change in contamination that results from new policy that regulates production practices occurs many steps prior to consumption of the product. This study proposes a framework for conducting microbial food‐safety risk assessments; this framework can be used to quantitatively assess the annual effects of national regulatory policies. Advantages of the framework are that estimates of human illnesses are consistent with national disease surveillance data (which are usually summarized on an annual basis) and some of the modeling steps that occur between production and consumption can be collapsed or eliminated. The framework leads to probabilistic models that include uncertainty and variability in critical input parameters; these models can be solved using a number of different Bayesian methods. The Bayesian synthesis method performs well for this application and generates posterior distributions of parameters that are relevant to assessing the effect of implementing a new policy. An example, based on Campylobacter and chicken, estimates the annual number of illnesses avoided by a hypothetical policy; this output could be used to assess the economic benefits of a new policy. Empirical validation of the policy effect is also examined by estimating the annual change in the numbers of illnesses observed via disease surveillance systems.
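A much simpler Monte Carlo stand-in for the kind of policy calculation described above (anchoring baseline annual illnesses to surveillance data and propagating an uncertain policy effect to illnesses avoided) is sketched below; it is not the Bayesian synthesis method itself, and every number in it is an illustrative assumption.

```python
import math
import random

random.seed(11)

# Toy stand-in for anchoring predicted illnesses to annual surveillance data
# and estimating illnesses avoided by a new policy. All numbers are
# illustrative assumptions, not estimates for Campylobacter and chicken.
N_SIM = 10_000
avoided = []
for _ in range(N_SIM):
    # Uncertain annual baseline, anchored to surveillance counts after an
    # assumed correction for under-diagnosis/under-reporting.
    baseline_illnesses = random.lognormvariate(math.log(850_000), 0.15)
    # Uncertain reduction in contamination at production under the policy,
    # and uncertain pass-through of that reduction to illnesses.
    contamination_reduction = random.uniform(0.20, 0.40)
    pass_through = random.uniform(0.5, 1.0)
    avoided.append(baseline_illnesses * contamination_reduction * pass_through)

avoided.sort()
print(f"median illnesses avoided per year: {avoided[N_SIM // 2]:,.0f}")
print(f"90% interval: {avoided[int(0.05 * N_SIM)]:,.0f} to "
      f"{avoided[int(0.95 * N_SIM)]:,.0f}")
```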

13.
This article summarizes a quantitative microbial risk assessment designed to characterize the public health impact of consumption of shell eggs and egg products contaminated with Salmonella Enteritidis (SE). This risk assessment's objectives were to: (1) establish the baseline risk of foodborne illness from SE, (2) identify and evaluate potential risk mitigation strategies, and (3) identify data gaps related to future research efforts. The risk assessment model has five modules. The Egg Production module estimates the number of eggs produced that are SE-contaminated. Shell Egg Processing, Egg Products Processing, and Preparation & Consumption modules estimate the increase or decrease in the numbers of SE organisms in eggs or egg products as they pass through storage, transportation, processing, and preparation. A Public Health Outcomes module then calculates the incidence of illnesses and four clinical outcomes, as well as the cases of reactive arthritis associated with SE infection following consumption. The baseline model estimates an average production of 2.3 million SE-contaminated shell eggs/year of the estimated 69 billion produced annually and predicts an average of 661,633 human illnesses per year from consumption of these eggs. The model estimates approximately 94% of these cases recover without medical care, 5% visit a physician, an additional 0.5% are hospitalized, and 0.05% result in death. The contribution of SE from commercially pasteurized egg products was estimated to be negligible. Five mitigation scenarios were selected for comparison of their individual and combined effects on the number of human illnesses. Results suggest that mitigation in only one segment of the farm-to-table continuum will be less effective than several applied in different segments. Key data gaps and areas for future research include the epidemiology of SE on farms, the bacteriology of SE in eggs, human behavior in food handling and preparation, and human responses to SE exposure.

14.
A recent paper by Ferrier and Buzby provides a framework for selecting the sample size when testing a lot of beef trim for Escherichia coli O157:H7 that equates the averted costs of recalls and health damages from contaminated meats sold to consumers with the increased costs of testing while allowing for uncertainty about the underlying prevalence of contamination. Ferrier and Buzby conclude that the optimal sample size is larger than the current sample size. However, Ferrier and Buzby's optimization model has a number of errors, and their simulations failed to consider available evidence about the likelihood of the scenarios explored under the model. After correctly modeling microbial prevalence as dependent on portion size and selecting model inputs based on available evidence, the model suggests that the optimal sample size is zero under most plausible scenarios. It does not follow, however, that sampling beef trim for E. coli O157:H7, or food safety sampling more generally, should be abandoned. Sampling is not generally cost effective as a direct consumer safety control measure due to the extremely large sample sizes required to provide a high degree of confidence of detecting very low acceptable defect levels. Food safety verification sampling creates economic incentives for food producing firms to develop, implement, and maintain effective control measures that limit the probability and degree of noncompliance with regulatory limits or private contract specifications.
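The economics hinge on how detection probability scales with sample size at low within-lot prevalence, which the short sketch below illustrates; the 60-sample plan and prevalence values are used only as examples.

```python
import math

def p_detect(prevalence, n_samples, sensitivity=1.0):
    """Probability that at least one of n independent samples from a lot
    tests positive, given within-lot sample-unit prevalence."""
    return 1.0 - (1.0 - prevalence * sensitivity) ** n_samples

def samples_needed(prevalence, confidence=0.95):
    """Samples required for a given confidence of detecting contamination."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# The economics hinge on this: very low prevalence needs very large n.
for prev in (0.10, 0.01, 0.001, 0.0001):
    print(f"prevalence {prev:>7.2%}:  "
          f"P(detect with n=60) = {p_detect(prev, 60):.3f},  "
          f"n for 95% confidence = {samples_needed(prev):,}")
```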

15.
Data from a human feeding trial with healthy men were used to develop a dose-response model for 13 strains of Salmonella and to determine the effects of strain variation on the shape of the dose-response curve. Dose-response data for individual strains were fit to a three-phase linear model to determine minimum, median, and maximum illness doses, which were used to define Pert distributions in a computer simulation model. Pert distributions for illness dose of individual strains were combined in an Excel spreadsheet using a discrete distribution to model strain prevalence. In addition, a discrete distribution was used to model dose groups and thus create a model that simulated human feeding trials. During simulation of the model with @Risk, an illness dose and a dose consumed were randomly assigned to each consumption event in the simulated feeding trial, and if the illness dose was greater than the dose consumed, the model predicted no illness; otherwise, the model predicted that an illness would occur. To verify the dose-response model predictions, the original feeding trial was simulated. The dose-response model predicted a median of 69 (range of 43-101) illnesses compared to 74 in the original trial. Thus, its predictions were in agreement with the data used to develop it. However, predictions of the model are only valid for eggnog, healthy men, and the strains and doses of Salmonella used to develop it. When multiple strains of Salmonella were simulated together, the predicted dose-response curves were irregular in shape. Thus, the sigmoid shape of dose-response curves in feeding trials with one strain of Salmonella may not accurately reflect dose response in naturally contaminated food where multiple strains may be present.
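A minimal reconstruction of the simulation logic described in the abstract (Pert-distributed illness doses per strain, a discrete distribution over strains, illness predicted when the consumed dose meets or exceeds the sampled illness dose) is sketched below; the strain parameters, prevalence weights, and dose groups are placeholders, not the fitted values for the 13 trial strains.

```python
import random

random.seed(3)

def pert(minimum, mode, maximum):
    """Sample from a Pert distribution (scaled Beta parameterization)."""
    alpha = 1 + 4 * (mode - minimum) / (maximum - minimum)
    beta = 1 + 4 * (maximum - mode) / (maximum - minimum)
    return minimum + random.betavariate(alpha, beta) * (maximum - minimum)

# Illustrative strain set: (prevalence weight, (min, median, max) log10 illness dose).
strains = [
    (0.4, (3.0, 5.0, 7.5)),   # relatively infectious strain
    (0.4, (4.0, 6.0, 8.5)),
    (0.2, (5.0, 7.0, 9.5)),   # strain rarely causing illness at low doses
]

dose_groups_log10 = [4, 5, 6, 7, 8]    # doses fed, log10 CFU
subjects_per_group = 6

def simulate_trial():
    illnesses = 0
    for log_dose in dose_groups_log10:
        for _ in range(subjects_per_group):
            r = random.random()
            cum = 0.0
            for weight, (lo, mode, hi) in strains:   # discrete strain pick
                cum += weight
                if r <= cum:
                    illness_dose = pert(lo, mode, hi)
                    break
            if log_dose >= illness_dose:             # dose consumed >= illness dose
                illnesses += 1
    return illnesses

results = sorted(simulate_trial() for _ in range(1000))
print(f"median predicted illnesses per trial: {results[500]}")
print(f"90% interval: {results[50]} to {results[949]}")
```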

16.
The Grunow–Finke assessment tool (GFT) is an accepted scoring system for determining the likelihood of an outbreak being unnatural in origin. Considering its high specificity but low sensitivity, a modified Grunow–Finke tool (mGFT) has been developed with improved sensitivity. The mGFT has been validated against some past disease outbreaks, but it has not been applied to ongoing outbreaks. This study aimed to score the outbreak of Middle East respiratory syndrome coronavirus (MERS-CoV) in Saudi Arabia using both the original GFT and the mGFT. The publicly available data on human cases of MERS-CoV infection reported in Saudi Arabia (2012–2018) were sourced from FluTrackers, the World Health Organization, the Saudi Ministry of Health, and published literature associated with MERS outbreak investigations. The risk assessment of MERS-CoV in Saudi Arabia was analyzed using the original GFT and mGFT criteria, algorithms, and thresholds. The scoring points for each criterion were determined by three researchers to minimize subjectivity. The results showed 40 points out of a possible 54 using the original GFT (likelihood: 74%) and 40 points out of a possible 60 using the mGFT (likelihood: 67%), with both tools indicating a high likelihood that human MERS-CoV in Saudi Arabia is unnatural in origin. The findings simply flag unusual patterns in this outbreak, but do not prove unnatural etiology. Proof of bioattacks can only be obtained by law enforcement and intelligence agencies. This study demonstrated the value and flexibility of the mGFT in assessing and predicting the risk for an ongoing outbreak with simple criteria.
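The arithmetic behind such scores is simple weighted criterion scoring, as the sketch below shows; the criteria and weights listed are invented placeholders rather than the actual GFT or mGFT criteria, and only the final 40/54 and 40/60 figures come from the abstract.

```python
# Toy scoring of the kind used by the Grunow-Finke tool: each criterion gets
# 0-3 points, multiplied by a weighting factor; likelihood is the weighted
# score as a fraction of the maximum attainable. Criteria names and weights
# below are invented placeholders, not the actual GFT/mGFT criteria.
criteria = {
    # name: (points 0-3, weighting factor)
    "biorisk of agent":              (2, 3),
    "unusual geographic clustering": (3, 1),
    "high concentration of agent":   (1, 2),
    "unusual strain/serotype":       (2, 3),
    "epidemic intensity":            (3, 1),
}

score = sum(points * weight for points, weight in criteria.values())
max_score = sum(3 * weight for _, weight in criteria.values())
likelihood = score / max_score

print(f"weighted score: {score} / {max_score}  ->  likelihood {likelihood:.0%}")
# The MERS-CoV assessment described above works the same way:
print(f"original GFT : {40 / 54:.0%}   (40 of 54 points)")
print(f"modified GFT : {40 / 60:.0%}   (40 of 60 points)")
```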

17.
Thomas Oscar 《Risk analysis》2021,41(1):110-130
Salmonella is a leading cause of foodborne illness (i.e., salmonellosis) outbreaks, which on occasion are attributed to ground turkey. The poultry industry uses Salmonella prevalence as an indicator of food safety. However, Salmonella prevalence is only one of several factors that determine risk of salmonellosis. Consequently, a model for predicting risk of salmonellosis from individual lots of ground turkey as a function of Salmonella prevalence and other risk factors was developed. Data for Salmonella contamination (prevalence, number, and serotype) of ground turkey were collected at meal preparation. Scenario analysis was used to evaluate effects of model variables on risk of salmonellosis. Epidemiological data were used to simulate Salmonella serotype virulence in a dose‐response model that was based on human outbreak and feeding trial data. Salmonella prevalence was 26% (n = 100) per 25 g of ground turkey, whereas Salmonella number ranged from 0 to 1.603 with a median of 0.185 log per 25 g. Risk of salmonellosis (total arbitrary units (AU) per lot) was affected (p ≤ 0.05) by Salmonella prevalence, number, and virulence, by incidence and extent of undercooking, and by food consumption behavior and host resistance but was not (p > 0.05) affected by serving size, serving size distribution, or total bacterial load of ground turkey when all other risk factors were held constant. When other risk factors were not held constant, Salmonella prevalence was not correlated (r = −0.39; p = 0.21) with risk of salmonellosis. Thus, Salmonella prevalence alone was not a good indicator of poultry food safety because other factors were found to alter risk of salmonellosis. In conclusion, a more holistic approach to poultry food safety, such as the process risk model developed in the present study, is needed to better protect public health from foodborne pathogens like Salmonella.

18.
Successful identification of unnatural epidemics relies on a sensitive risk assessment tool designed for the differentiation between unnatural and natural epidemics. The Grunow–Finke tool (GFT), which has been the most widely used, however, has low sensitivity in such differentiation. We aimed to recalibrate the GFT and improve the performance in detection of unnatural epidemics. The comparator was the original GFT and its application in 11 historical outbreaks, including eight confirmed unnatural outbreaks and three natural outbreaks. Three steps were involved: (i) removing criteria, (ii) changing weighting factors, and (iii) adding and refining criteria. We created a series of alternative models to examine changes in the likelihood parameter for unnatural outbreaks until we found a model that correctly identified all the unnatural outbreaks and the natural ones. Finally, the recalibrated GFT was tested and validated with data from an unnatural and natural outbreak, respectively. A total of 238 models were tested. Through the removal of criteria, increasing or decreasing weighting factors of other criteria, adding a new criterion titled “special insights,” and setting a new threshold for likelihood, we increased the sensitivity of the GFT from 38% to 100%, and retained the specificity at 100% in detecting unnatural epidemics. Using test data from an unnatural and a natural outbreak, the recalibrated GFT correctly classified their etiology. The recalibrated GFT could be integrated into routine outbreak investigation by public health institutions and agencies responsible for biosecurity.

19.
To address the risk posed to human health by the consumption of VTEC O157 within contaminated pork, lamb, and beef products within Great Britain, a quantitative risk assessment model has been developed. This model aims to simulate the prevalence and amount of VTEC O157 in different meat products at consumption within a single model framework by adapting previously developed models. The model is stochastic in nature, enabling both variability (natural variation between animals, carcasses, products) and uncertainty (lack of knowledge) about the input parameters to be modeled. Based on the model assumptions and data, it is concluded that the prevalence of VTEC O157 in meat products (joints and mince) at consumption is low (i.e., <0.04%). Beef products, particularly beef burgers, present the highest estimated risk with an estimated eight out of 100,000 servings on average resulting in human infection with VTEC O157.

20.
Tucker Burch 《Risk analysis》2019,39(3):599-615
The assumptions underlying quantitative microbial risk assessment (QMRA) are simple and biologically plausible, but QMRA predictions have never been validated for many pathogens. The objective of this study was to validate QMRA predictions against epidemiological measurements from outbreaks of waterborne gastrointestinal disease. I screened 2,000 papers and identified 12 outbreaks with the necessary data: disease rates measured using epidemiological methods and pathogen concentrations measured in the source water. Eight of the 12 outbreaks were caused by Cryptosporidium, three by Giardia, and one by norovirus. Disease rates varied from 5.5 × 10^-6 to 1.1 × 10^-2 cases/person‐day, and reported pathogen concentrations varied from 1.2 × 10^-4 to 8.6 × 10^2 per liter. I used these concentrations with single‐hit dose–response models for all three pathogens to conduct QMRA, producing both point and interval predictions of disease rates for each outbreak. Comparison of QMRA predictions to epidemiological measurements showed good agreement; interval predictions contained measured disease rates for 9 of 12 outbreaks, with point predictions off by factors of 1.0–120 (median = 4.8). Furthermore, 11 outbreaks occurred at mean doses of less than 1 pathogen per exposure. Measured disease rates for these outbreaks were clearly consistent with a single‐hit model, and not with a “two‐hit” threshold model. These results demonstrate the validity of QMRA for predicting disease rates due to Cryptosporidium and Giardia.
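The kind of prediction being validated can be reproduced in a few lines with an exponential single-hit model, as sketched below; the per-organism infectivity, source-water concentration, intake, and the "measured" rate used for comparison are all assumed values of plausible magnitude, not data from any specific outbreak in the study.

```python
import math

# Minimal single-hit (exponential) QMRA of the kind validated in the study.
# The per-organism infectivity, concentration, intake, and the "measured"
# rate are assumed values of plausible magnitude, not data from the paper.
r_per_organism = 0.02      # probability of illness per ingested organism
conc_per_L = 0.05          # pathogens per liter measured in source water
intake_L_per_day = 1.0     # unboiled tap water consumed per day

mean_dose = conc_per_L * intake_L_per_day                 # organisms/exposure
predicted = 1.0 - math.exp(-r_per_organism * mean_dose)   # cases/person-day

measured = 4.0e-3          # hypothetical epidemiologically measured rate

ratio = max(predicted, measured) / min(predicted, measured)
print(f"mean dose per exposure: {mean_dose:.3f} organisms (well below 1)")
print(f"QMRA point prediction : {predicted:.2e} cases/person-day")
print(f"epi measurement       : {measured:.2e} cases/person-day")
print(f"prediction off by a factor of {ratio:.1f}")
```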
