Similar Articles
20 similar articles found (search time: 31 ms)
1.
Listeria monocytogenes is a leading cause of hospitalization, fetal loss, and death due to foodborne illnesses in the United States. A quantitative assessment of the relative risk of listeriosis associated with the consumption of 23 selected categories of ready-to-eat foods, published by the U.S. Department of Health and Human Services and the U.S. Department of Agriculture in 2003, has been instrumental in identifying the food products and practices that pose the greatest listeriosis risk and has guided the evaluation of potential intervention strategies. Dose-response models, which quantify the relationship between an exposure dose and the probability of adverse health outcomes, were essential components of the risk assessment. However, because of gaps and limitations in the available data and modeling approaches, considerable uncertainty existed. Since publication of the risk assessment, new data have become available for modeling L. monocytogenes dose-response. At the same time, recent advances in the understanding of L. monocytogenes pathophysiology and strain diversity have warranted a critical reevaluation of the published dose-response models. To discuss strategies for modeling L. monocytogenes dose-response, the Interagency Risk Assessment Consortium (IRAC) and the Joint Institute for Food Safety and Applied Nutrition (JIFSAN) held a scientific workshop in 2011 (details available at http://foodrisk.org/irac/events/). The main findings of the workshop and the most current and relevant data identified during the workshop are summarized and presented in the context of L. monocytogenes dose-response. This article also discusses new insights on dose-response modeling for L. monocytogenes and research opportunities to meet future needs.

2.
Evaluations of Listeria monocytogenes dose-response relationships are crucially important for risk assessment and risk management, but are complicated by considerable variability across population subgroups and L. monocytogenes strains. Despite difficulties associated with the collection of adequate data from outbreak investigations or sporadic cases, the limitations of currently available animal models, and the inability to conduct human volunteer studies, some of the available data now allow refinements of the well-established exponential L. monocytogenes dose response to more adequately represent extremely susceptible population subgroups and highly virulent L. monocytogenes strains. Here, a model incorporating adjustments for variability in L. monocytogenes strain virulence and host susceptibility was derived for 11 population subgroups with similar underlying comorbidities using data from multiple sources, including human surveillance and food survey data. In light of the unique inherent properties of L. monocytogenes dose response, a lognormal-Poisson dose-response model was chosen, and proved able to reconcile dose-response relationships developed based on surveillance data with outbreak data. This model was compared to a classical beta-Poisson dose-response model, which was insufficiently flexible for modeling the specific case of L. monocytogenes dose-response relationships, especially in outbreak situations. Overall, the modeling results suggest that most listeriosis cases are linked to the ingestion of food contaminated with medium to high concentrations of L. monocytogenes. While additional data are needed to refine the derived model and to better characterize and quantify the variability in L. monocytogenes strain virulence and individual host susceptibility, the framework derived here represents a promising approach to more adequately characterize the risk of listeriosis in highly susceptible population subgroups.
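For orientation, the exponential dose-response model referenced above has the closed form P(ill | D) = 1 - exp(-r * D), and the lognormal-Poisson variant lets the parameter r vary across strains and hosts. A minimal Monte Carlo sketch of that distinction (Python; the lognormal parameters below are illustrative placeholders, not the values fitted in the study):

```python
import numpy as np

rng = np.random.default_rng(42)

def p_ill_exponential(dose, r):
    """Exponential dose-response: probability of illness at a given dose."""
    return 1.0 - np.exp(-r * dose)

def p_ill_lognormal_poisson(dose, mu_log10_r, sigma_log10_r, n_draws=100_000):
    """Marginal illness probability when log10(r) ~ Normal(mu, sigma),
    capturing variability in strain virulence and host susceptibility."""
    r = 10.0 ** rng.normal(mu_log10_r, sigma_log10_r, n_draws)
    return p_ill_exponential(dose, r).mean()

dose = 1e6  # CFU ingested in one serving
print(p_ill_exponential(dose, r=1e-12))                        # fixed r
print(p_ill_lognormal_poisson(dose, mu_log10_r=-12, sigma_log10_r=2))
```

Averaging over the r distribution fattens the upper tail of the risk, which is what allows a single model to cover both highly susceptible subgroups and highly virulent outbreak strains.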

3.
Consumer Phase Risk Assessment for Listeria monocytogenes in Deli Meats
The foodborne disease risk associated with the pathogen Listeria monocytogenes has been the subject of recent efforts in quantitative microbial risk assessment. Building upon one of these efforts undertaken jointly by the U.S. Food and Drug Administration and the U.S. Department of Agriculture (USDA), the purpose of this work was to expand on the consumer phase of the risk assessment to focus on handling practices in the home. One-dimensional Monte Carlo simulation was used to model variability in growth and cross-contamination of L. monocytogenes during food storage and preparation of deli meats. Simulations approximated that 0.3% of the servings were contaminated with >10^4 CFU/g of L. monocytogenes at the time of consumption. The estimated mean risk associated with the consumption of deli meats for the intermediate-age population was approximately 7 deaths per 10^11 servings. Food handling in homes increased the estimated mean mortality by 10^6-fold. Of all the home food-handling practices modeled, inadequate storage, particularly refrigeration temperatures, provided the greatest contribution to increased risk. The impact of cross-contamination in the home was considerably less. Adherence to USDA Food Safety and Inspection Service recommendations for consumer handling of ready-to-eat foods substantially reduces the risk of listeriosis.

4.
Semisoft cheese made from raw sheep's milk is traditionally and economically important in southern Europe. However, raw milk cheese is also a known vehicle of human listeriosis, and contamination of sheep cheese with Listeria monocytogenes has been reported. In the present study, we have developed and applied a quantitative risk assessment model, based on available evidence and challenge testing, to estimate risk of invasive listeriosis due to consumption of an artisanal sheep cheese made with raw milk collected from a single flock in central Italy. In the model, contamination of milk may originate from the farm environment or from mastitic animals, with potential growth of the pathogen in bulk milk and during cheese ripening. Based on the 48-day challenge test of a local semisoft raw sheep's milk cheese, we found limited growth only during the initial phase of ripening (24 hours) and no growth or limited decline during the following ripening period. In our simulation, in the baseline scenario, 2.2% of cheese servings are estimated to have at least 1 colony forming unit (CFU) per gram. Of these, 15.1% would be above the current E.U. limit of 100 CFU/g (5.2% would exceed 1,000 CFU/g). Risk of invasive listeriosis per random serving is estimated in the 10^-12 range (mean) for healthy adults, and in the 10^-10 range (mean) for vulnerable populations. When small flocks (10-36 animals) are combined with the presence of a sheep with undetected subclinical mastitis, risk of listeriosis increases and such flocks may represent a public health risk.

5.
A novel extension of traditional growth models for exposure assessment of food-borne microbial pathogens was developed to address the complex interactions of competing microbial populations in foods. Scenarios were designed for baseline refrigeration and mild abuse of servings of chicken broiler and ground beef. Our approach employed high-quality data for microbiology of foods at production, refrigerated storage temperatures, and growth kinetics of microbial populations in culture media. Simple parallel models were developed for exponential growth of multiple pathogens and the abundant and ubiquitous nonpathogenic indigenous microbiota. Monte Carlo simulations were run for unconstrained growth and growth with the density-dependent constraint based on the "Jameson effect," inhibition of pathogen growth when the indigenous microbiota reached 10^9 counts per serving. The modes for unconstrained growth of the indigenous microbiota were 10^8, 10^10, and 10^11 counts per serving for chicken broilers, and 10^7, 10^9, and 10^11 counts per serving for ground beef at respective sites for backroom, meat case, and home refrigeration. Contamination rates and likelihoods of reaching temperatures supporting growth of the pathogens in the baseline refrigeration scenario were rare events. The unconstrained exponential growth models appeared to overestimate L. monocytogenes growth maxima for the baseline refrigeration scenario by 1500-7233% (10^6-10^7 counts/serving) when the inhibitory effects of the indigenous microbiota are ignored. The extreme tails of the distributions for the constrained models appeared to overestimate growth maxima by 110% (10^4-10^5 counts/serving) for Salmonella spp. and 108% (6 × 10^3 counts/serving) for E. coli O157:H7 relative to the extremes of the unconstrained models. The approach of incorporating parallel models for pathogens and the indigenous microbiota into exposure assessment modeling motivates the design of validation studies to test the modeling assumptions, consistent with the analytical-deliberative process of risk analysis.
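The "Jameson effect" constraint described above can be sketched as exponential growth of the pathogen that halts once the indigenous microbiota reaches its ceiling. A minimal illustration (Python; the 10^9 counts-per-serving ceiling follows the abstract, while the rate constants and initial counts are hypothetical):

```python
import numpy as np

def jameson_growth(n0_path, n0_flora, mu_path, mu_flora, t_hours,
                   flora_max=1e9):
    """Exponential growth of pathogen and indigenous flora; pathogen growth
    stops (Jameson effect) once the flora reaches flora_max counts/serving."""
    # time at which the indigenous flora hits its ceiling
    t_jameson = np.log(flora_max / n0_flora) / mu_flora
    t_eff = min(t_hours, max(t_jameson, 0.0))
    n_path = n0_path * np.exp(mu_path * t_eff)   # growth frozen after t_jameson
    n_flora = min(n0_flora * np.exp(mu_flora * t_hours), flora_max)
    return n_path, n_flora

# hypothetical serving: 10 CFU pathogen, 1e4 CFU flora, 72 h of storage
print(jameson_growth(10, 1e4, mu_path=0.05, mu_flora=0.15, t_hours=72))
```

Comparing runs with flora_max=np.inf against the constrained default reproduces, in miniature, the overestimation the authors report for the unconstrained models.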

6.
One-third of the annual cases of listeriosis in the United States occur during pregnancy and can lead to miscarriage or stillbirth, premature delivery, or infection of the newborn. Previous risk assessments completed by the Food and Drug Administration/the Food Safety Inspection Service of the U.S. Department of Agriculture/the Centers for Disease Control and Prevention (FDA/USDA/CDC)(1) and the Food and Agriculture Organization/the World Health Organization (FAO/WHO)(2) were based on dose-response data from mice. Recent animal studies using nonhuman primates(3,4) and guinea pigs(5) have both estimated LD50s of approximately 10^7 Listeria monocytogenes colony forming units (cfu). The FAO/WHO(2) estimated a human LD50 of 1.9 × 10^6 cfu based on data from a pregnant woman consuming contaminated soft cheese. We reevaluated risk based on dose-response curves from pregnant rhesus monkeys and guinea pigs. Using standard risk assessment methodology including hazard identification, exposure assessment, hazard characterization, and risk characterization, risk was calculated based on the new dose-response information. To compare models, we looked at mortality rate per serving at predicted doses ranging from 10^-4 to 10^12 L. monocytogenes cfu. Based on a serving of 10^6 L. monocytogenes cfu, the primate model predicts a death rate of 5.9 × 10^-1, compared to the FDA/USDA/CDC (fig. IV-12)(1) predicted rate of 1.3 × 10^-7. Based on the guinea pig and primate models, the mortality rate calculated by the FDA/USDA/CDC(1) is an underestimate for this susceptible population.
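To make the comparison concrete: under an exponential dose-response model, an LD50 converts to the model parameter via r = ln(2)/LD50, since the probability of death is 0.5 at the LD50. A worked sketch of that arithmetic (Python; this illustrates the exponential form only and is not the authors' exact fitting procedure):

```python
import math

def r_from_ld50(ld50):
    """Exponential model: P(death | D) = 1 - exp(-r * D); P = 0.5 at D = LD50."""
    return math.log(2) / ld50

def p_death(dose, r):
    return 1.0 - math.exp(-r * dose)

r_animal = r_from_ld50(1e7)    # LD50 ~ 1e7 cfu from the primate/guinea pig data
print(p_death(1e6, r_animal))  # ~0.067 per serving of 1e6 cfu
```

Even this simple conversion yields a per-serving death rate orders of magnitude above the 1.3 × 10^-7 predicted by the earlier FDA/USDA/CDC curve, consistent with the abstract's conclusion that risk for this susceptible population was underestimated.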

7.
Increasing evidence suggests that persistence of Listeria monocytogenes in food processing plants has been the underlying cause of a number of human listeriosis outbreaks. This study extracts criteria used by food safety experts in determining bacterial persistence in the environment, using retail delicatessen operations as a model. Using the Delphi method, we conducted an expert elicitation with 10 food safety experts from academia, industry, and government to classify L. monocytogenes persistence based on environmental sampling results collected over six months for 30 retail delicatessen stores. The results were modeled using variations of random forest, support vector machine, logistic regression, and linear regression; variable importance values of random forest and support vector machine models were consolidated to rank important variables in the experts' classifications. The duration of subtype isolation ranked most important across all expert categories. Sampling site category also ranked high in importance, and validation errors doubled when this covariate was removed. Support vector machine and random forest models successfully classified the data with average validation errors of 3.1% and 2.2% (n = 144), respectively. Our findings indicate that (i) the frequency of isolations over time and sampling site information are critical factors for experts determining subtype persistence, (ii) food safety experts from different sectors may not use the same criteria in determining persistence, and (iii) machine learning models have potential for future use in environmental surveillance and risk management programs. Future work is necessary to validate the accuracy of expert and machine classification against biological measurement of L. monocytogenes persistence.
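A compact sketch of the classification setup described above: fit random forest and support vector machine models to expert persistence labels, cross-validate, and read off variable importances (Python/scikit-learn; the feature names and synthetic data are hypothetical stand-ins for the study's environmental sampling covariates):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 144  # matches the size of the study's validation set

# hypothetical covariates: duration of subtype isolation (months),
# number of positive samples, and an encoded sampling-site category
X = np.column_stack([
    rng.uniform(0, 6, n),      # duration_of_isolation
    rng.integers(1, 10, n),    # n_positive_samples
    rng.integers(0, 4, n),     # sampling_site_category
])
y = (X[:, 0] > 3).astype(int)  # synthetic "persistent" label for illustration

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)

print("RF  CV error:", 1 - cross_val_score(rf, X, y, cv=5).mean())
print("SVM CV error:", 1 - cross_val_score(svm, X, y, cv=5).mean())
print("RF variable importances:", rf.feature_importances_)
```

Because the synthetic label here is driven by the first covariate, its importance dominates, mirroring the study's finding that duration of subtype isolation ranked highest.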

8.
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show practically how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event of the milk, the fresh cheese, or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize which data should be collected to improve and validate the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.

9.
The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts that were bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway. The model used Canadian input parameter values, where available, to represent risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year. Potential reasons for this overestimation were discussed. A sensitivity analysis showed that the concentration of Salmonella on chicken breasts at retail and food hygiene practices in private kitchens, such as cross-contamination due to not washing cutting boards (or utensils) and hands after handling raw meat, along with inadequate cooking, contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that protection from the Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research.
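The retail-to-table concentration changes described above are typically modeled with paired growth and inactivation equations applied step by step along the pathway. A minimal sketch (Python; a square-root-type growth model and a log-linear D/z thermal inactivation model are standard QMRA choices, but every parameter value below is a hypothetical placeholder):

```python
import math

def growth_log10(temp_c, hours, t_min=5.0, b=0.03):
    """Square-root-type growth: no growth at or below the minimum temperature."""
    if temp_c <= t_min:
        return 0.0
    mu = (b * (temp_c - t_min)) ** 2  # growth rate, log10 CFU per hour
    return mu * hours

def cook_kill_log10(temp_c, minutes, d_ref=0.5, t_ref=60.0, z=5.0):
    """Log-linear thermal inactivation using a D-value/z-value model."""
    d_value = d_ref * 10 ** ((t_ref - temp_c) / z)  # minutes per log10 kill
    return minutes / d_value

c_retail = 1.0                                         # log10 CFU/g at retail
c_home = c_retail + growth_log10(temp_c=10, hours=24)  # abusive home storage
c_cooked = c_home - cook_kill_log10(62, 1.0)           # undercooked: 1 min at 62 °C
print(c_home, c_cooked)
```

Chaining such steps per serving inside a Monte Carlo loop, with cross-contamination transfers added, is the basic structure of a retail-to-table model like this one.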

10.
The preservation of perishable food via refrigeration in the supply chain is essential to extend shelf life and provide consumers with safe food. However, the electricity consumed in refrigeration processes has an economic and an environmental impact. This study focuses on the cold chain of cooked ham, including transport, the cold room in the supermarket, the display cabinet, transport by the consumer, and the domestic refrigerator, and aims to predict the risk for human health associated with Listeria monocytogenes, the amount of food wasted due to the growth of spoilage bacteria, and the electricity consumption needed to maintain product temperature through the cold chain. A set of eight intervention actions were tested to evaluate their impact on the three criteria. Results show that modifying the thermostat of the domestic refrigerator has a high impact on food safety and food waste and a limited impact on electricity consumption. Conversely, modifying the airflow rate in the display cabinet has a high impact on electricity consumption and a limited impact on food safety and food waste. A cost-benefit analysis approach and two multicriteria decision analysis methods were used to rank the intervention actions. These three methodologies show that setting the thermostat of the domestic refrigerator to 4 °C presents the best compromise between the three criteria. The impact of decision-maker preferences (criteria weights) and the limitations of these three approaches are discussed. The approaches proposed by this study may be useful in decision making to evaluate the global impact of intervention actions in issues involving conflicting outputs.

11.
Cross-contamination and undercooking are major factors responsible for campylobacteriosis and as such should be incorporated in microbiological risk assessment. A previous paper by van Asselt et al.(1) quantified cross-contamination routes from chicken breast fillet via hands, cutting boards, and knives, ending up in a prepared chicken-curry salad in the domestic kitchen. The aim of the current article was to validate the obtained transfer rates against consumer data obtained by video observations and microbial analyses of a home-prepared chicken-curry salad. Results showed a wide range of microbial contamination levels in the final salad, caused by various cross-contamination practices and heating times varying from 2 min 44 s to 41 min 30 s. Model predictions indicated that cooking times should be at least 8 minutes and that cutting boards need to be changed after cutting raw chicken in order to obtain safe bacterial levels in the final salad. The model explained around 75% of the variance in cross-contamination behavior. Accuracy of the model can be further improved by including other cross-contamination routes besides hands, cutting boards, and knives. The model proved to be fail-safe, which implies it can be used as a worst-case estimate to assess the importance of cross-contamination in the home.

12.
The public health significance of transmission of ESBL-producing Escherichia coli and Campylobacter from poultry farms to humans through flies was investigated using a worst-case risk model. Human exposure was modeled by the fraction of contaminated flies, the number of specific bacteria per fly, the number of flies leaving the poultry farm, and the number of positive poultry houses in the Netherlands. Simplified risk calculations for transmission through consumption of chicken fillet were used for comparison, in terms of the number of human exposures, the total human exposure, and, for Campylobacter only, the number of human cases of illness. Comparing estimates of the worst-case risk of transmission through flies with estimates of the real risk of chicken fillet consumption, the number of human exposures to ESBL-producing E. coli was higher for chicken fillet as compared with flies, but the total level of exposure was higher for flies. For Campylobacter, risk values were nearly consistently higher for transmission through flies than for chicken fillet consumption. This indicates that the public health risk of transmission of both ESBL-producing E. coli and Campylobacter to humans through flies might be of importance. It justifies further modeling of transmission through flies, for which additional data (fly emigration, human exposure) are required. Similar analyses of other environmental transmission routes from poultry farms are suggested to precede further investigations into flies.

13.
Jocelyne Rocourt, Risk Analysis, 2012, 32(10): 1798-1819
We used a quantitative microbiological risk assessment model to describe the risk of Campylobacter and Salmonella infection linked to chicken meals prepared in households in Dakar, Senegal. The model uses data collected specifically for this study, such as the prevalence and level of bacteria on the neck skin of chickens bought in Dakar markets, time-temperature profiles recorded from purchase to consumption, an observational survey of meal preparation in private kitchens, and detection and enumeration of pathogens on kitchenware and cooks' hands. Thorough heating kills all bacteria present on chicken during cooking, but cross-contamination of cooked chicken or ready-to-eat food prepared for the meal via kitchenware and cooks' hands leads to a high expected frequency of pathogen ingestion. Additionally, significant growth of Salmonella is predicted during food storage at ambient temperature before and after meal preparation. These high exposures lead to a high estimated risk of campylobacteriosis and/or salmonellosis in Dakar households. The public health consequences could be amplified by the high level of antimicrobial resistance of Salmonella and Campylobacter observed in this setting. A significant decrease in the number of ingested bacteria and in the risk could be achieved through a reduction of the prevalence of chicken contamination at slaughter, and by the use of simple hygienic measures in the kitchen. There is an urgent need to reinforce the hygiene education of food handlers in Senegal.

14.
We develop a prioritization framework for foodborne risks that considers public health impact as well as three other factors (market impact, consumer risk acceptance and perception, and social sensitivity). Canadian case studies are presented for six pathogen-food combinations: Campylobacter spp. in chicken; Salmonella spp. in chicken and spinach; Escherichia coli O157 in spinach and beef; and Listeria monocytogenes in ready-to-eat meats. Public health impact is measured by disability-adjusted life years and the cost of illness. Market impact is quantified by the economic importance of the domestic market. Likert-type scales are used to capture consumer perception and acceptance of risk and social sensitivity to impacts on vulnerable consumer groups and industries. Risk ranking is facilitated through the development of a knowledge database presented in the format of info cards and the use of multicriteria decision analysis (MCDA) to aggregate the four factors. Three scenarios representing different stakeholders illustrate the use of MCDA to arrive at rankings of pathogen-food combinations that reflect different criteria weights. The framework provides a flexible instrument to support policymakers in complex risk prioritization decision making when different stakeholder groups are involved and when multiple pathogen-food combinations are compared.

15.
To prevent and control foodborne diseases, there is a fundamental need to identify the foods that are most likely to cause illness. The goal of this study was to rank 25 commonly consumed food products associated with Salmonella enterica contamination in the Central Region of Mexico. A multicriteria decision analysis (MCDA) framework was developed to obtain an S. enterica risk score for each food product based on four criteria: probability of exposure to S. enterica through domestic food consumption (Se); S. enterica growth potential during home storage (Sg); per capita consumption (Pcc); and food attribution of S. enterica outbreaks (So). Risk scores were calculated by the equation Se*W1 + Sg*W2 + Pcc*W3 + So*W4, where each criterion was assigned a normalized value (1-5) and the relative weights (W) were defined by the opinion of 22 experts. Se had the largest effect on the risk score, being the criterion with the highest weight (35%; 95% CI 20%-60%), followed by So (24%; 5%-50%), Sg (23%; 10%-40%), and Pcc (18%; 10%-35%). The results identified chicken (4.4 ± 0.6), pork (4.2 ± 0.6), and beef (4.2 ± 0.5) as the highest-risk foods, followed by seed fruits (3.6 ± 0.5), tropical fruits (3.4 ± 0.4), and dried fruits and nuts (3.4 ± 0.5), while the food products with the lowest risk were yogurt (2.1 ± 0.3), chorizo (2.1 ± 0.4), and cream (2.0 ± 0.3). Approaches with expert-based weighting and equal weighting showed good correlation (R = 0.96) and did not show significant differences in ranking order within the top-20 tier. This study can help risk managers select interventions and develop targeted surveillance programs against S. enterica in high-risk food products.
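The weighted-sum scoring above is simple to reproduce. A small sketch (Python; the weights are the expert-elicited means reported in the abstract, while the criterion scores for the example food are hypothetical):

```python
# expert-elicited mean weights from the abstract
WEIGHTS = {"Se": 0.35, "Sg": 0.23, "Pcc": 0.18, "So": 0.24}

def risk_score(scores, weights=WEIGHTS):
    """Weighted sum of normalized (1-5) criterion scores."""
    return sum(scores[k] * weights[k] for k in weights)

# hypothetical normalized scores for a high-risk product such as chicken
chicken = {"Se": 5, "Sg": 4, "Pcc": 4, "So": 5}
print(round(risk_score(chicken), 2))  # 4.59 on the 1-5 scale
```

Re-running the ranking with equal weights (0.25 each) is the robustness check the authors performed, which in their data left the top-20 ordering essentially unchanged.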

16.
This article reports a quantitative risk assessment of human listeriosis linked to the consumption of soft cheeses made from raw milk. Risk assessment was based on data purposely acquired over the period 2000-2001 for two French cheeses, namely Camembert of Normandy and Brie of Meaux. Estimated Listeria monocytogenes concentration in raw milk was on average 0.8 and 0.3 cells/L in the Normandy and Brie regions, respectively. A Monte Carlo simulation was used to account for the time-temperature history of the milk and cheeses from farm to table. It was assumed that cell progeny did not spread within the solid cheese matrix (as they would be free to do in liquid broth). Interaction between pH and temperature was accounted for in the growth model. The simulated proportion of servings with no L. monocytogenes cell was 88% for Brie and 82% for Camembert. The 99th percentile of L. monocytogenes cell numbers in servings of 27 g of cheese was 131 for Brie and 77 for Camembert at the time of consumption, corresponding to approximately five and three cells of L. monocytogenes per gram, respectively. The expected number of severe listeriosis cases would be ≤10^-3 and ≤2.5 × 10^-3 per year for 17 million servings of Brie of Meaux and 480 million servings of Camembert of Normandy, respectively.

17.
T. Walton, Risk Analysis, 2012, 32(7): 1122-1138
Through the use of case-control analyses and quantitative microbial risk assessment (QMRA), the relative risks of cryptosporidiosis transmission (recreational water exposure vs. drinking water consumption) were evaluated for a Canadian community with higher-than-national rates of cryptosporidiosis. A QMRA was developed to assess the risk of Cryptosporidium infection through the consumption of municipally treated drinking water. Simulations were based on site-specific surface water contamination levels and drinking water treatment log10 reduction capacity for Cryptosporidium. Results suggested that the risk of Cryptosporidium infection via drinking water in the study community, assuming routine operation of the water treatment plant, was negligible (6 infections per 10^13 persons per day; 5th percentile: 2 infections per 10^15 persons per day; 95th percentile: 3 infections per 10^12 persons per day). The risk is essentially nonexistent during optimized, routine treatment operations. The study community achieves between 7 and 9 log10 Cryptosporidium oocyst reduction through routine water treatment processes. Although these results do not preclude the need for constant vigilance by both water treatment and public health professionals in this community, they suggest that the cause of higher rates of cryptosporidiosis is more likely recreational water contact, or perhaps direct animal contact. QMRA can be successfully applied at the community level to identify data gaps, rank relative public health risks, and forecast future risk scenarios. It is most useful when performed in a collaborative way with local stakeholders, from beginning to end of the risk analysis paradigm.
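The treatment-barrier arithmetic behind these figures reduces the source-water concentration by the plant's log10 reduction and passes the resulting dose through a dose-response model. A minimal daily-risk sketch (Python; the exponential form is a common choice for Cryptosporidium, but the parameter r and the other inputs below are illustrative placeholders, not this study's fitted values):

```python
import math

def daily_infection_risk(source_oocysts_per_l, log10_reduction,
                         litres_per_day=1.0, r=0.01):
    """Exponential dose-response applied to treated drinking water:
    dose = source concentration * 10**-LR * daily consumption."""
    dose = source_oocysts_per_l * 10 ** (-log10_reduction) * litres_per_day
    return 1.0 - math.exp(-r * dose)

# hypothetical source water carrying 10 oocysts/L, treated at 7-9 log10
for lr in (7, 8, 9):
    print(f"{lr} log10 reduction:", daily_infection_risk(10, lr))
```

The steep drop in risk per extra log of treatment is why routine-operation risk comes out negligible and why recreational or animal contact becomes the more plausible exposure route.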

18.
In this article, the performance objectives (POs) for Bacillus cereus group (BC) in celery, cheese, and spelt added as ingredients in a ready-to-eat mixed spelt salad, packaged under modified atmosphere, were calculated using a Bayesian approach. In order to derive the POs, BC detection and enumeration were performed in nine lots of naturally contaminated ingredients and final product. Moreover, the impact of specific production steps on the BC contamination was quantified. Finally, a sampling plan to verify the compliance of ingredient lots with each PO value at a 95% confidence level (CL) was defined. To calculate the POs, detection results as well as results above the limit of detection but below the limit of quantification (i.e., censored data) were analyzed. The most probable distribution of the censored data was determined and two-dimensional (2D) Monte Carlo simulations were performed. The PO values were calculated to meet a food safety objective of 4 log10 cfu of BC per g of spelt salad at the time of consumption. When BC grows during storage by between 0.90 and 1.90 log10 cfu/g, the POs for BC in celery, cheese, and spelt ranged between 1.21 log10 cfu/g for celery and 2.45 log10 cfu/g for spelt. This article represents the first attempt to apply the PO concept and 2D Monte Carlo simulation across the flow chart of a complex food matrix, including raw and cooked ingredients.
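The backbone of the PO derivation is the standard food-safety-objective balance: the PO at an earlier step equals the FSO at consumption minus the expected log10 increase (growth) between that step and consumption. A deterministic sketch with the abstract's numbers (Python; the published POs come from a 2D Monte Carlo over lot variability and censored-data uncertainty, which this simple subtraction omits):

```python
def performance_objective(fso_log10, expected_growth_log10):
    """PO = FSO - expected log10 growth between the PO point and consumption."""
    return fso_log10 - expected_growth_log10

FSO = 4.0  # log10 CFU/g of B. cereus at the time of consumption
for growth in (0.90, 1.90):
    print(f"growth {growth}: PO = {performance_objective(FSO, growth):.2f}")
# deterministic POs of 3.10 and 2.10; the published per-ingredient values
# (1.21-2.45 log10 CFU/g) are lower because the Bayesian/2D Monte Carlo
# treatment adds margins for lot-to-lot variability and uncertainty
```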

19.
The improvement of food safety in the domestic environment requires a transdisciplinary approach, involving interaction between both the social and natural sciences. This approach is applied in a study on risks associated with Campylobacter on broiler meat. First, some web-based information interventions were designed and tested on participant motivation and intentions to cook more safely. Based on these self-reported measures, the intervention supported by the emotion "disgust" was selected as the most promising information intervention. Its effect on microbial cross-contamination was tested by recruiting a set of participants who prepared a salad with chicken breast fillet carrying a known amount of tracer bacteria. The amount of tracer that could be recovered from the salad revealed the transfer and survival of Campylobacter and was used as a measure of hygiene. This was introduced into an existing risk model on Campylobacter in the Netherlands to assess the effect of the information intervention both at the level of exposure and the level of human disease risk. We showed that the information intervention supported by the emotion "disgust" alone had no measurable effect on the health risk. However, when a behavioral cue was embedded within the instruction for the salad preparation, the risk decreased sharply. It is shown that a transdisciplinary approach, involving research on risk perception, microbiology, and risk assessment, is successful in evaluating the efficacy of an information intervention in terms of human health risks. The approach offers a novel tool for science-based risk management in the area of food safety.

20.
Risk Analysis, 2018, 38(8): 1738-1757
We developed a risk assessment of human salmonellosis associated with consumption of alfalfa sprouts in the United States to evaluate the public health impact of applying treatments to seeds (0-5 log10 reduction in Salmonella) and testing spent irrigation water (SIW) during production. The risk model considered variability and uncertainty in Salmonella contamination in seeds, Salmonella growth and spread during sprout production, sprout consumption, and Salmonella dose response. Based on an estimated prevalence of 2.35% for 6.8-kg seed batches and without interventions, the model predicted 76,600 (95% confidence interval (CI): 15,400-248,000) cases/year. The risk reduction (by 5- to 7-fold) predicted from a 1-log10 seed treatment alone was comparable to SIW testing alone, and each additional 1-log10 seed treatment was predicted to provide a greater risk reduction than SIW testing. A 3-log10 or a 5-log10 seed treatment reduced the predicted cases/year to 139 (95% CI: 33-448) or 1.4 (95% CI: <1-4.5), respectively. Combined with SIW testing, a 3-log10 or 5-log10 seed treatment reduced the cases/year to 45 (95% CI: 10-146) or <1 (95% CI: <1-1.5), respectively. If the SIW coverage was less complete (i.e., less representative), a smaller risk reduction was predicted; e.g., a combined 3-log10 seed treatment and SIW testing with 20% coverage resulted in an estimated 92 (95% CI: 22-298) cases/year. Analysis of alternative scenarios using different assumptions for key model inputs showed that the predicted relative risk reductions are robust. This risk assessment provides a comprehensive approach for evaluating the public health impact of various interventions in a sprout production system.
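Structurally, the intervention comparison above applies a log10 reduction to contaminated seed batches and a detection-based rejection step for spent irrigation water. A toy Monte Carlo sketch of that pipeline (Python; the prevalence matches the abstract, but the survival shortcut, detection probability, and cases-per-batch scaling are crude hypothetical placeholders, not the model's inputs):

```python
import numpy as np

rng = np.random.default_rng(1)

def predicted_cases(n_batches=200_000, prevalence=0.0235,
                    treatment_log10=3.0, siw_coverage=0.0,
                    siw_sensitivity=0.95, cases_per_escaped_batch=0.4):
    """Toy intervention logic: a contaminated batch causes cases only if the
    seed treatment leaves survivors (crudely, probability 10**-LR) and SIW
    testing neither covers nor detects it."""
    contaminated = rng.random(n_batches) < prevalence
    survives = rng.random(n_batches) < 10.0 ** (-treatment_log10)
    caught = rng.random(n_batches) < siw_coverage * siw_sensitivity
    escaped = contaminated & survives & ~caught
    return escaped.sum() * cases_per_escaped_batch

for lr in (1, 3, 5):
    print(f"{lr}-log10 treatment, no SIW:", predicted_cases(treatment_log10=lr))
print("3-log10 treatment + full SIW:",
      predicted_cases(treatment_log10=3.0, siw_coverage=1.0))
```

Even this caricature reproduces the qualitative pattern of the published results: each extra log of seed treatment buys roughly a tenfold drop in escaping batches, while SIW testing multiplies in a further fixed reduction.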
