Similar Articles
20 similar articles found (search time: 547 ms)
1.
The economically optimal sample size in a food safety test balances the marginal costs and marginal benefits of increasing the sample size. We provide a method for selecting the sample size when testing beef trim for Escherichia coli O157:H7 that equates the averted costs of recalls and health damages from contaminated meats sold to consumers with the increased costs of testing while allowing for uncertainty about the underlying prevalence rates of contamination. Using simulations, we show that, in most cases, the optimal sample size is larger than the current sample size of 60 and, in some cases, it exceeds 120. Moreover, lots with a lower prevalence rate have a higher expected damage because contamination is more difficult to detect. Our simulations indicate that these lots have a higher optimal sampling rate.
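As a rough illustration of the cost-benefit logic described in this abstract, the sketch below searches for the per-lot sample size that maximizes expected averted damage minus testing cost, with uncertainty about the within-lot prevalence represented by a beta prior. The cost figures, the prevalence prior, and the detection rule (any positive sample triggers removal of the lot) are assumptions for illustration only, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

COST_PER_SAMPLE = 50.0       # assumed cost of drawing and testing one sample ($)
DAMAGE_IF_MISSED = 20_000.0  # assumed recall + health damage if a contaminated lot ships ($)
N_DRAWS = 10_000             # Monte Carlo draws over the uncertain prevalence

# Uncertainty about the within-lot prevalence of contaminated sample units (assumed prior)
prevalence = rng.beta(2.0, 100.0, size=N_DRAWS)

def expected_net_benefit(n):
    """Expected averted damage minus testing cost for a per-lot sample of size n."""
    p_detect = 1.0 - (1.0 - prevalence) ** n          # at least one positive sample
    return np.mean(p_detect) * DAMAGE_IF_MISSED - n * COST_PER_SAMPLE

sizes = np.arange(1, 301)
benefits = np.array([expected_net_benefit(n) for n in sizes])
print("economically optimal sample size:", sizes[benefits.argmax()])
```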

2.
We analyze the risk of contracting illness due to the consumption in the United States of hamburgers contaminated with enterohemorrhagic Escherichia coli (EHEC) of serogroup O157 produced from manufacturing beef imported from Australia. We used a novel approach for estimating risk, based on the prevalence and concentration estimates of E. coli O157 in lots of beef that were withdrawn from the export chain following detection of the pathogen. For the purpose of the present assessment, we assumed that no product is removed from the supply chain following testing. This, together with a number of additional conservative assumptions, leads to an overestimation of E. coli O157-associated illness attributable to the consumption of ground beef patties manufactured only from Australian beef. We predict 49.6 illnesses (95% interval: 0.0–148.6) from the 2.46 billion hamburgers made from 155,000 t of Australian manufacturing beef exported to the United States in 2012. All of these illnesses were due to undercooking in the home; less than one illness is predicted from consumption of hamburgers cooked to a temperature of 68 °C in quick-service restaurants.

3.
Much of the literature regarding food safety sampling plans implicitly assumes that all lots entering commerce are tested. In practice, however, only a fraction of lots may be tested due to a budget constraint. In such a case, there is a tradeoff between the number of lots tested and the number of samples per lot. To illustrate this tradeoff, a simple model is presented in which the optimal number of samples per lot depends on the prevalence of sample units that do not conform to microbiological specifications and the relative costs of sampling a lot and of drawing and testing a sample unit from a lot. The assumed objective is to maximize the number of nonconforming lots that are rejected subject to a food safety sampling budget constraint. If the ratio of the cost per lot to the cost per sample unit is substantial, the optimal number of samples per lot increases as prevalence decreases. However, if the ratio of the cost per lot to the cost per sample unit is sufficiently small, the optimal number of samples per lot reduces to one (i.e., simple random sampling), regardless of prevalence. In practice, the cost per sample unit may be large relative to the cost per lot due to the expense of laboratory testing and other factors. Designing effective compliance assurance measures depends on economic, legal, and other factors in addition to microbiology and statistics.
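A minimal sketch of the budget tradeoff described above: with a fixed budget, more samples per lot means fewer lots tested, and the objective is an index proportional to the number of nonconforming lots rejected. The budget and cost figures are assumptions; the two lot-cost values are chosen only to contrast the "substantial" and "negligible" lot-cost regimes discussed in the abstract.

```python
import numpy as np

BUDGET = 100_000.0          # total sampling budget ($), assumed
COST_PER_SAMPLE = 150.0     # cost of drawing and testing one sample unit ($), assumed

def expected_rejections(n, prevalence, cost_per_lot):
    """Index proportional to the expected number of nonconforming lots rejected."""
    lots_tested = BUDGET / (cost_per_lot + n * COST_PER_SAMPLE)
    p_detect = 1.0 - (1.0 - prevalence) ** n      # >= 1 positive among n samples
    return lots_tested * p_detect

ns = np.arange(1, 61)
for cost_per_lot in (500.0, 1.0):                 # substantial vs negligible lot cost
    for prevalence in (0.20, 0.05, 0.01):
        scores = [expected_rejections(n, prevalence, cost_per_lot) for n in ns]
        print(f"lot cost {cost_per_lot:>6.0f}, prevalence {prevalence:.2f}: "
              f"optimal samples per lot = {ns[np.argmax(scores)]}")
```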

4.
We develop a prioritization framework for foodborne risks that considers public health impact as well as three other factors (market impact, consumer risk acceptance and perception, and social sensitivity). Canadian case studies are presented for six pathogen-food combinations: Campylobacter spp. in chicken; Salmonella spp. in chicken and spinach; Escherichia coli O157 in spinach and beef; and Listeria monocytogenes in ready-to-eat meats. Public health impact is measured by disability-adjusted life years and the cost of illness. Market impact is quantified by the economic importance of the domestic market. Likert-type scales are used to capture consumer perception and acceptance of risk and social sensitivity to impacts on vulnerable consumer groups and industries. Risk ranking is facilitated through the development of a knowledge database presented in the format of info cards and the use of multicriteria decision analysis (MCDA) to aggregate the four factors. Three scenarios representing different stakeholders illustrate the use of MCDA to arrive at rankings of pathogen-food combinations that reflect different criteria weights. The framework provides a flexible instrument to support policymakers in complex risk prioritization decision making when different stakeholder groups are involved and when multiple pathogen-food combinations are compared.
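A toy weighted-sum aggregation in the spirit of the MCDA step described above, showing how different stakeholder weight profiles change the ranking of pathogen-food combinations. The combinations, normalized scores, and weights are invented placeholders, not values from the study.

```python
import numpy as np

combos = ["Campylobacter-chicken", "Salmonella-chicken",
          "E. coli O157-beef", "L. monocytogenes-RTE meats"]
# Columns: public health impact, market impact, consumer perception, social sensitivity
# (all pre-normalized to 0-1; values are invented for illustration)
scores = np.array([
    [0.9, 0.8, 0.6, 0.5],
    [0.7, 0.8, 0.7, 0.4],
    [0.6, 0.9, 0.9, 0.6],
    [0.5, 0.4, 0.5, 0.9],
])
weights = {"health-focused":   np.array([0.6, 0.1, 0.2, 0.1]),
           "industry-focused": np.array([0.2, 0.5, 0.1, 0.2])}

for stakeholder, w in weights.items():
    ranking = np.argsort(scores @ w)[::-1]          # weighted-sum aggregation, best first
    print(stakeholder, "->", [combos[i] for i in ranking])
```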

5.
This study aimed at developing a predictive model that captures the influences of a variety of agricultural and environmental variables and is able to predict the concentrations of enteric bacteria in soil amended with untreated Biological Soil Amendments of Animal Origin (BSAAO) under dynamic conditions. We developed and validated a Random Forest model using data from a longitudinal field study conducted in the mid-Atlantic United States investigating the survival of Escherichia coli O157:H7 and generic E. coli in soils amended with untreated dairy manure, horse manure, or poultry litter. Amendment type, days of rain since the previous sampling day, and soil moisture content were identified as the most influential agricultural and environmental variables impacting concentrations of viable E. coli O157:H7 and generic E. coli recovered from amended soils. Our model results also indicated that E. coli O157:H7 and generic E. coli declined at similar rates in amended soils under dynamic field conditions. The Random Forest model accurately predicted changes in viable E. coli concentrations over time under different agricultural and environmental conditions. Our model also accurately characterized the variability of E. coli concentration in amended soil over time by providing upper and lower prediction bound estimates. Cross-validation results indicated that our model can potentially be generalized to other geographic regions and incorporated into a risk assessment for evaluating the risks associated with application of untreated BSAAO. Our model can be validated for other regions, and its predictive performance can be further enhanced as data sets from additional geographic regions become available.
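A minimal sketch of the modeling approach named above: a Random Forest regressor fitted to amendment type, rainfall, and soil-moisture predictors of log E. coli concentration, with cross-validation and feature importances. The synthetic data generator below stands in for the field data, which are not reproduced here; variable names and effect sizes are assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "amendment": rng.choice(["dairy_manure", "horse_manure", "poultry_litter"], n),
    "days_since_rain": rng.integers(0, 15, n),
    "soil_moisture_pct": rng.uniform(5, 40, n),
    "days_since_application": rng.integers(0, 120, n),
})
# Synthetic response: log CFU/g declining over time, modulated by moisture and rainfall
y = (6.0 - 0.04 * df["days_since_application"] + 0.03 * df["soil_moisture_pct"]
     - 0.05 * df["days_since_rain"] + rng.normal(0, 0.5, n))

X = pd.get_dummies(df, columns=["amendment"])       # one-hot encode amendment type
model = RandomForestRegressor(n_estimators=500, random_state=0)
print("cross-validated R^2: %.2f" % cross_val_score(model, X, y, cv=5).mean())
model.fit(X, y)
print(dict(zip(X.columns, model.feature_importances_.round(3))))
```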

6.
Food safety monitoring faces the challenge of tackling multiple chemicals along the various stages of the food supply chain. Our study developed a methodology for optimizing sampling for the monitoring of multiple chemicals along the dairy supply chain. We used a mixed integer nonlinear programming approach to maximize the performance of the sampling in terms of reducing the risk of potential disability-adjusted life years (DALYs) in the population. Decision variables are the number of samples collected and analyzed at each stage of the food chain (feed mills, dairy farms, milk trucks, and dairy processing plants) for each chemical, given a predefined budget. The model was applied to the case of monitoring for aflatoxin B1/M1 (AFB1/M1) and dioxins in a hypothetical Dutch dairy supply chain, and results were calculated for various contamination scenarios defined in terms of contamination fraction and concentrations. Across the monitoring budgets considered for both chemicals, monitoring for AFB1/M1 proved more effective than monitoring for dioxins in most of the scenarios, because AFB1/M1 can result in more DALYs than dioxins when both chemicals occur at the same contamination fraction, and the cost of analyzing one AFB1/M1 sample is lower than that of one dioxin sample. The results suggest that relatively more resources should be spent on monitoring AFB1/M1 when both chemicals' contamination fractions are low; when both contamination fractions are higher, relatively more of the budget should be allocated to monitoring dioxins.
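A much-simplified sketch of the allocation problem described above, reduced to a single chain stage and solved by brute force rather than mixed integer nonlinear programming: spend a fixed budget on AFB1/M1 versus dioxin samples so as to maximize avoided DALYs. All costs, contamination fractions, DALY weights, and lot counts are assumptions for illustration.

```python
import numpy as np

BUDGET = 20_000.0                                     # monitoring budget ($), assumed
COST = {"AFB1/M1": 100.0, "dioxins": 400.0}           # assumed cost per analyzed sample ($)
CONTAM_FRACTION = {"AFB1/M1": 0.02, "dioxins": 0.02}  # assumed contaminated fraction of lots
DALY_PER_MISSED_LOT = {"AFB1/M1": 0.8, "dioxins": 0.5}
N_LOTS = 1_000                                        # lots passing the monitored stage

def avoided_dalys(n_afb1, n_diox):
    """Expected DALYs avoided by detecting (and removing) contaminated lots."""
    total = 0.0
    for chem, n in (("AFB1/M1", n_afb1), ("dioxins", n_diox)):
        p_sampled = min(n / N_LOTS, 1.0)              # chance a given lot is sampled
        total += N_LOTS * CONTAM_FRACTION[chem] * p_sampled * DALY_PER_MISSED_LOT[chem]
    return total

best = (-np.inf, 0, 0)
for n_afb1 in range(0, int(BUDGET // COST["AFB1/M1"]) + 1):
    n_diox = int((BUDGET - n_afb1 * COST["AFB1/M1"]) // COST["dioxins"])
    best = max(best, (avoided_dalys(n_afb1, n_diox), n_afb1, n_diox))
print("avoided DALYs %.2f with %d AFB1/M1 and %d dioxin samples" % best)
```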

7.
The objective of this study was to leverage quantitative risk assessment to investigate possible root cause(s) of foodborne illness outbreaks related to Shiga toxin-producing Escherichia coli O157:H7 (STEC O157) infections in leafy greens in the United States. To this end, we developed the FDA leafy green quantitative risk assessment epidemic curve prediction model (FDA-LG QRA-EC), which simulates the lettuce supply chain. The model was used to predict the number of reported illnesses and the epidemic curve associated with lettuce contaminated with STEC O157 for a wide range of scenarios representing various contamination conditions and facility processing/sanitation practices. Model predictions were generated for fresh-cut and whole lettuce, quantifying the differing impacts of facility processing and home preparation on predicted illnesses. Our model revealed that the timespan (i.e., the number of days with at least one reported illness) and the peak (i.e., the day with the highest predicted number of reported illnesses) of the epidemic curve of a STEC O157-lettuce outbreak were not strongly influenced by facility processing/sanitation practices and were instead indicative of the contamination pattern among incoming lettuce batches received by the facility or distribution center. Through comparisons with the observed numbers of illnesses from recent STEC O157-lettuce outbreaks, the model identified contamination conditions on incoming lettuce heads that could result in an outbreak of similar size, which can be used to narrow down potential root cause hypotheses.

8.
To address the risk posed to human health by the consumption of VTEC O157 in contaminated pork, lamb, and beef products in Great Britain, a quantitative risk assessment model has been developed. This model aims to simulate the prevalence and amount of VTEC O157 in different meat products at consumption within a single model framework by adapting previously developed models. The model is stochastic in nature, enabling both variability (natural variation between animals, carcasses, and products) and uncertainty (lack of knowledge) about the input parameters to be modeled. Based on the model assumptions and data, it is concluded that the prevalence of VTEC O157 in meat products (joints and mince) at consumption is low (i.e., <0.04%). Beef products, particularly beef burgers, present the highest estimated risk, with an estimated eight out of 100,000 servings, on average, resulting in human infection with VTEC O157.

9.
Thomas Oscar, Risk Analysis, 2021, 41(1): 110–130
Salmonella is a leading cause of foodborne illness (i.e., salmonellosis) outbreaks, which on occasion are attributed to ground turkey. The poultry industry uses Salmonella prevalence as an indicator of food safety. However, Salmonella prevalence is only one of several factors that determine risk of salmonellosis. Consequently, a model for predicting risk of salmonellosis from individual lots of ground turkey as a function of Salmonella prevalence and other risk factors was developed. Data for Salmonella contamination (prevalence, number, and serotype) of ground turkey were collected at meal preparation. Scenario analysis was used to evaluate effects of model variables on risk of salmonellosis. Epidemiological data were used to simulate Salmonella serotype virulence in a dose-response model that was based on human outbreak and feeding trial data. Salmonella prevalence was 26% (n = 100) per 25 g of ground turkey, whereas Salmonella number ranged from 0 to 1.603 with a median of 0.185 log per 25 g. Risk of salmonellosis (total arbitrary units (AU) per lot) was affected (p ≤ 0.05) by Salmonella prevalence, number, and virulence, by incidence and extent of undercooking, and by food consumption behavior and host resistance, but was not (p > 0.05) affected by serving size, serving size distribution, or total bacterial load of ground turkey when all other risk factors were held constant. When other risk factors were not held constant, Salmonella prevalence was not correlated (r = −0.39; p = 0.21) with risk of salmonellosis. Thus, Salmonella prevalence alone was not a good indicator of poultry food safety because other factors were found to alter risk of salmonellosis. In conclusion, a more holistic approach to poultry food safety, such as the process risk model developed in the present study, is needed to better protect public health from foodborne pathogens like Salmonella.
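A Monte Carlo sketch of a per-serving risk calculation that combines prevalence, contamination level, undercooking, and a dose-response step, in the spirit of the process risk model above. The triangular level distribution reuses the reported 0/0.185/1.603 log per 25 g figures; the serving size, undercooking fraction, cooking log reductions, and beta-Poisson parameters are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

PREVALENCE = 0.26                                        # per 25 g, as reported above
log10_n_per_25g = rng.triangular(0.0, 0.185, 1.603, N)   # log CFU per 25 g (min/median/max reused)
contaminated = rng.random(N) < PREVALENCE

serving_g = 100.0                                        # assumed serving size
dose_raw = (10.0 ** log10_n_per_25g) * (serving_g / 25.0)

undercooked = rng.random(N) < 0.10                       # assumed fraction of undercooked servings
log_reduction = np.where(undercooked, 3.0, 7.0)          # assumed cooking lethality (log10)
dose = dose_raw * 10.0 ** (-log_reduction) * contaminated

alpha, beta = 0.21, 49.78                                # illustrative beta-Poisson parameters (assumed)
p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)
print("mean risk of illness per serving: %.2e" % p_ill.mean())
```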

10.
Root cause analysis can be used in foodborne illness outbreak investigations to determine the underlying causes of an outbreak and to help identify actions that could be taken to prevent future outbreaks. We developed a new tool, the Quantitative Risk Assessment-Epidemic Curve Prediction Model (QRA-EC), to assist with these goals and applied it to a case study to investigate and illustrate the utility of leveraging quantitative risk assessment to provide unique insights for foodborne illness outbreak root cause analysis. We used a 2019 Salmonella outbreak linked to melons as a case study to demonstrate the utility of this model (Centers for Disease Control and Prevention [CDC], 2019). The model was used to evaluate the impact of various root cause hypotheses (representing different contamination sources and food safety system failures in the melon supply chain) on the predicted number and timeline of illnesses. The predicted number of illnesses varied by contamination source and was strongly impacted by the prevalence and level of Salmonella contamination on the surface/inside of whole melons and inside contamination niches on equipment surfaces. The timeline of illnesses was most strongly impacted by equipment sanitation efficacy for contamination niches. Evaluations of a wide range of scenarios representing various potential root causes enabled us to identify which hypotheses were likely to result in an outbreak of similar size and illness timeline to the 2019 Salmonella melon outbreak. The QRA-EC framework can be adapted to accommodate any food–pathogen pairs to provide insights for foodborne outbreak investigations.

11.
This paper develops an explicit relationship between sample size, sampling error, and related costs for the application of multiple regression models in observational studies. Graphs and formulas for determining optimal sample sizes and related factors are provided to facilitate the application of the derived models. These graphs reveal that, in most cases, the imprecision of estimates and minimum total cost are relatively insensitive to increases in sample size beyond n=20. Because of the intrinsic variation of the regression model, even if larger samples are optimal, the relative change in the total cost function is small when the cost of imprecision is a quadratic function. A model-utility approach, however, may impose a lower bound on sample size that requires the sample size be larger than indicated by the estimation or cost-minimization approaches. Graphs are provided to illustrate lower-bound conditions on sample size. Optimal sample size in view of all considerations is obtained by the maximin criterion, the maximum of the minimum sample size for all approaches.
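A small numerical sketch of the cost-minimization idea: total cost is sampling cost plus a quadratic cost of imprecision, with the imprecision of a regression estimate shrinking roughly as sigma^2/n. The constants are assumptions chosen only to show how flat the total cost curve can be beyond modest sample sizes, in the spirit of the n=20 observation above.

```python
import numpy as np

SIGMA2 = 4.0             # residual variance of the regression model (assumed)
COST_PER_OBS = 10.0      # cost of collecting one observation (assumed)
K_IMPRECISION = 2_000.0  # cost per unit of squared estimation error (assumed)

n = np.arange(5, 101)
total_cost = COST_PER_OBS * n + K_IMPRECISION * SIGMA2 / n   # sampling cost + quadratic loss
n_opt = n[total_cost.argmin()]
print("cost-minimizing n:", n_opt)
print("total cost at n=20 vs optimum: %.1f vs %.1f"
      % (total_cost[n == 20][0], total_cost.min()))
```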

12.
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, the use of quantitative microbiological risk assessment is an appealing approach to link new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to practically show how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event in the milk, the fresh cheese, or the process environment is simulated over time, across space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed the identification of the major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.

13.
Risk Analysis, 2018, 38(8): 1718–1737
We developed a probabilistic mathematical model for the postharvest processing of leafy greens, focusing on Escherichia coli O157:H7 contamination of fresh-cut romaine lettuce as the case study. Our model can (i) support the investigation of cross-contamination scenarios, and (ii) evaluate and compare different risk mitigation options. We used an agent-based modeling framework to predict the pathogen prevalence and levels in bags of fresh-cut lettuce and to quantify the spread of E. coli O157:H7 from contaminated lettuce to surface areas of processing equipment. Using an unbalanced factorial design, we propagated combinations of random values assigned to model inputs through different processing steps and ranked statistically significant inputs with respect to their impacts on selected model outputs. Results indicated that whether contamination originated on incoming lettuce heads or on the surface areas of processing equipment, pathogen prevalence among bags of fresh-cut lettuce and batches was most significantly impacted by the level of free chlorine in the flume tank and the frequency of replacing the wash water inside the tank. Pathogen levels in bags of fresh-cut lettuce were most significantly influenced by the initial levels of contamination on incoming lettuce heads or surface areas of processing equipment. The influence of surface contamination on pathogen prevalence or levels in fresh-cut bags depended on the location of that surface relative to the flume tank. This study demonstrates that developing a flexible yet mathematically rigorous modeling tool, a "virtual laboratory," can provide valuable insights into the effectiveness of individual and combined risk mitigation options.
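A toy simulation of the flume-tank mechanism highlighted in the results above: free chlorine is consumed as batches are processed, and once it is depleted, pathogens shed by a contaminated batch persist in the shared wash water and cross-contaminate later batches. The chlorine decay, replenishment schedule, inactivation kinetics, and prevalence are all assumed, and this is far simpler than the agent-based framework of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

N_BATCHES = 200
CHLORINE_START_PPM = 10.0
CHLORINE_DECAY_PER_BATCH = 0.25      # ppm consumed per batch by organic load (assumed)
REPLENISH_EVERY = 50                 # batches between wash-water changes (assumed)
P_INCOMING_CONTAMINATED = 0.02       # prevalence among incoming batches (assumed)

chlorine, water_load = CHLORINE_START_PPM, 0.0   # free chlorine (ppm), pathogen load (arbitrary units)
cross_contaminated = 0
for b in range(N_BATCHES):
    if b % REPLENISH_EVERY == 0:                 # wash water replaced, chlorine restored
        chlorine, water_load = CHLORINE_START_PPM, 0.0
    chlorine = max(chlorine - CHLORINE_DECAY_PER_BATCH, 0.0)
    if rng.random() < P_INCOMING_CONTAMINATED:   # a contaminated batch sheds into the water
        water_load += 100.0
    water_load *= 10.0 ** (-0.5 * chlorine)      # assumed chlorine inactivation kinetics
    if water_load > 1.0:                         # residual load contaminates the outgoing batch
        cross_contaminated += 1
print(f"cross-contaminated batches: {cross_contaminated} / {N_BATCHES}")
```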

14.
The objective of this study is to identify a procedure for determining sample size allocation for food radiation inspections of more than one food item to minimize the potential risk to consumers of internal radiation exposure. We consider a simplified case of food radiation monitoring and safety inspection in which a risk manager is required to monitor two food items, milk and spinach, in a contaminated area. Three protocols for food radiation monitoring with different sample size allocations were assessed by simulating random sampling and inspections of milk and spinach in a conceptual monitoring site. Distributions of ¹³¹I and radiocesium concentrations were determined in reference to ¹³¹I and radiocesium concentrations detected in Fukushima prefecture, Japan, for March and April 2011. The results of the simulations suggested that a protocol that allocates sample size to milk and spinach based on the estimation of ¹³¹I and radiocesium concentrations using the apparent decay rate constants sequentially calculated from past monitoring data can most effectively minimize the potential risks of internal radiation exposure.
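A toy version of the decay-based allocation idea: fit an apparent decay-rate constant to past monitoring data for each food, project the ¹³¹I concentration forward, and allocate a fixed number of samples in proportion to a crude exceedance index. The limits, monitoring history, and allocation rule below are invented for illustration and are not the protocols evaluated in the study.

```python
import numpy as np

TOTAL_SAMPLES = 100
LIMIT_BQ_PER_KG = {"milk": 300.0, "spinach": 2000.0}   # assumed 131I limits (Bq/kg)

# Past monitoring: (day, mean 131I concentration in Bq/kg) for each food (invented values)
history = {
    "milk":    np.array([[0, 1200.0], [4, 900.0], [8, 650.0]]),
    "spinach": np.array([[0, 9000.0], [4, 5200.0], [8, 3100.0]]),
}

def exceedance_index(food, days_ahead):
    """Project concentration with an apparent decay constant; compare to the limit."""
    t, c = history[food][:, 0], history[food][:, 1]
    k = -np.polyfit(t, np.log(c), 1)[0]                 # apparent decay rate constant (1/day)
    projected = c[-1] * np.exp(-k * days_ahead)
    return projected / LIMIT_BQ_PER_KG[food]

idx = {food: exceedance_index(food, days_ahead=3) for food in history}
total = sum(idx.values())
allocation = {food: round(TOTAL_SAMPLES * v / total) for food, v in idx.items()}
print(allocation)
```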

15.
In quantitative microbiological risk assessment (QMRA), the consumer phase model (CPM) describes the part of the food chain between purchase of the food product at retail and exposure. Construction of a CPM is complicated by the large variation in consumer food handling practices and a limited availability of data. Therefore, several subjective (simplifying) assumptions have to be made when a CPM is constructed, but with a single CPM their impact on the QMRA results is unclear. We therefore compared the performance of eight published CPMs for Campylobacter in broiler meat in an example of a QMRA, where all the CPMs were analyzed using one single input distribution of concentrations at retail, and the same dose-response relationship. It was found that, between CPMs, there may be a considerable difference in the estimated probability of illness per serving. However, the estimated relative risk reductions are less different for scenarios modeling the implementation of control measures. For control measures affecting the Campylobacter prevalence, the relative risk is proportional irrespective of the CPM used. However, for control measures affecting the concentration the CPMs show some difference in the estimated relative risk. This difference is largest for scenarios where the aim is to remove the highly contaminated portion from human exposure. Given these results, we conclude that for many purposes it is not necessary to develop a new detailed CPM for each new QMRA. However, more observational data on consumer food handling practices and their impact on microbial transfer and survival are needed to generalize this conclusion.

16.
A model of a production process that uses an unscheduled set-up policy and fraction-defective control charts to control current production is developed, taking into consideration all of the relevant costs: the cost of sampling, the cost of not detecting a change in the process, the cost of a false indication of change, and the cost of re-adjusting after detected changes. The model is based on the concept of the expected time between detection of changes calling for set-ups. It is shown that unscheduled set-ups and control charts can be combined optimally by using the combination of sample size, sampling interval, and extent of control limits from the process average that provides the minimum expected total cost per unit of time. The cost of controlling a production process with unscheduled set-ups using the appropriate optimal control charts is then compared with the cost of a production process using scheduled set-ups at optimal intervals in conjunction with its appropriate control charts. This comparison indicates the criteria for preferring production processes with scheduled set-ups at optimal intervals over unscheduled set-ups. Suggestions are made for evaluating the optimal process set-up strategy, and the accompanying decision parameters, for any specific cost data by means of computer enumeration.
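A rough sketch of the computer-enumeration suggestion at the end of this abstract: grid-search the sample size n, sampling interval h, and control-limit width k of a fraction-defective chart, scoring each design with an approximate expected cost per hour. The cost structure is a simplified stand-in for the paper's model, and every constant is assumed.

```python
import numpy as np
from scipy.stats import binom

P0, P1 = 0.01, 0.05          # in-control / out-of-control fraction defective (assumed)
SHIFT_RATE = 0.02            # process shifts per hour (assumed)
C_SAMPLE, C_FALSE, C_OUT, C_SETUP = 1.0, 50.0, 200.0, 100.0  # assumed costs ($)

def cost_per_hour(n, h, k):
    """Approximate expected cost per hour for sample size n, interval h (hours), limit width k."""
    ucl = P0 + k * np.sqrt(P0 * (1 - P0) / n)            # upper control limit on p-hat
    alpha = binom.sf(np.floor(ucl * n), n, P0)           # false alarm probability per sample
    power = binom.sf(np.floor(ucl * n), n, P1)           # detection probability per sample
    if power < 1e-6:
        return np.inf
    samples_to_detect = 1.0 / power
    return (C_SAMPLE * n / h                              # sampling cost
            + C_FALSE * alpha / h                         # false indications of change
            + SHIFT_RATE * (C_OUT * samples_to_detect * h # time spent running out of control
                            + C_SETUP))                   # re-adjustment after detection

designs = [(cost_per_hour(n, h, k), n, h, k)
           for n in range(20, 201, 20) for h in (0.5, 1, 2, 4) for k in (2.0, 2.5, 3.0)]
print("best (cost per hour, n, h, k):", min(designs))
```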

17.
Shiga toxin-producing Escherichia coli (STEC) strains may cause human infections ranging from simple diarrhea to Haemolytic Uremic Syndrome (HUS). The five main pathogenic serotypes of STEC (MPS-STEC) identified thus far in Europe are O157:H7, O26:H11, O103:H2, O111:H8, and O145:H28. Because STEC strains can survive or grow during cheese making, particularly in soft cheeses, a stochastic quantitative microbial risk assessment model was developed to assess the risk of HUS associated with the five MPS-STEC in raw milk soft cheeses. A baseline scenario represents a theoretical worst-case scenario in which no intervention is applied throughout the farm-to-fork continuum; the risk level assessed with this baseline scenario is the risk-based level. The impact of seven preharvest scenarios (vaccines, probiotics, milk farm sorting) on the risk-based level was expressed in terms of risk reduction. The impact of the preharvest interventions ranges from 76% to 98% risk reduction, with the highest values predicted for scenarios combining a decrease in the number of cows shedding STEC with a decrease in the STEC concentration in feces. The impact of postharvest interventions on the risk-based level was also tested by applying five microbiological criteria (MC) at the end of ripening. The five MCs differ in terms of sample size, the number of samples that may yield a value larger than the microbiological limit, and the analysis methods. The predicted risk reduction varies from 25% to 96% when MCs are applied without preharvest interventions, and from 1% to 96% when pre- and postharvest interventions are combined.

18.
This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
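A minimal sketch of ANOVA-style sensitivity ranking: each continuous input is binned, the variation of the model output across bins is scored with an F statistic, and inputs are ranked by F. The toy risk model below is nonlinear with a threshold, the kind of setting where correlation-based rankings can mislead; it is an illustration only, not the MFSPR model itself.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(7)
N = 5_000
inputs = {
    "storage_temp_C": rng.uniform(2, 15, N),
    "storage_days": rng.uniform(0, 7, N),
    "cook_log_reduction": rng.uniform(3, 8, N),
}
# Toy output: growth only above a temperature threshold, then a cooking kill step
growth = np.where(inputs["storage_temp_C"] > 7,
                  0.3 * (inputs["storage_temp_C"] - 7) * inputs["storage_days"], 0.0)
risk = 10.0 ** (2.0 + growth - inputs["cook_log_reduction"])

def f_value(x, y, bins=10):
    """F statistic of output y across quantile bins of input x (one-way ANOVA)."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    labels = np.digitize(x, edges[1:-1])
    groups = [y[labels == b] for b in np.unique(labels)]
    return f_oneway(*groups).statistic

ranking = sorted(inputs, key=lambda name: f_value(inputs[name], risk), reverse=True)
print("inputs ranked by F value:", ranking)
```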

19.
Increasing evidence suggests that persistence of Listeria monocytogenes in food processing plants has been the underlying cause of a number of human listeriosis outbreaks. This study extracts criteria used by food safety experts in determining bacterial persistence in the environment, using retail delicatessen operations as a model. Using the Delphi method, we conducted an expert elicitation with 10 food safety experts from academia, industry, and government to classify L. monocytogenes persistence based on environmental sampling results collected over six months for 30 retail delicatessen stores. The results were modeled using variations of random forest, support vector machine, logistic regression, and linear regression; variable importance values of random forest and support vector machine models were consolidated to rank important variables in the experts' classifications. The duration of subtype isolation ranked most important across all expert categories. Sampling site category also ranked high in importance and validation errors doubled when this covariate was removed. Support vector machine and random forest models successfully classified the data with average validation errors of 3.1% and 2.2% (n = 144), respectively. Our findings indicate that (i) the frequency of isolations over time and sampling site information are critical factors for experts determining subtype persistence, (ii) food safety experts from different sectors may not use the same criteria in determining persistence, and (iii) machine learning models have potential for future use in environmental surveillance and risk management programs. Future work is necessary to validate the accuracy of expert and machine classification against biological measurement of L. monocytogenes persistence.

20.
The management of microbial risk in food products requires the ability to predict the growth kinetics of pathogenic microorganisms in the event of contamination and growth initiation. Useful data for assessing these issues may be found in the literature or obtained from experimental results. However, the large number and variety of data make further development difficult. Statistical techniques, such as meta-analysis, are then useful for synthesizing a set of distinct but similar experiments. Moreover, predictive modeling tools can be employed to complete the analysis and help the food safety manager interpret the data. In this article, a protocol for performing a meta-analysis of the output of a relational database, combined with quantitative microbiology models, is presented. The methodology is illustrated with the effect of temperature on pathogenic Escherichia coli and Listeria monocytogenes growing in culture medium, beef meat, and milk products. Using a database and predictive models, simulations of growth in a given product subjected to various temperature scenarios can be produced. It is then possible to compare food products for a given microorganism, according to its growth ability in these products, and to compare the behavior of bacteria in a given foodstuff. These results can assist decisions for a variety of questions on food safety.
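A small sketch of turning a temperature scenario into predicted growth with a secondary (square-root type) model, the kind of predictive-microbiology step the article combines with the database. The slope and notional minimum growth temperature below are illustrative assumptions, not values fitted from the database.

```python
import numpy as np

B = 0.023      # square-root model slope (assumed), sqrt(ln CFU / h) per degree C
T_MIN = -1.5   # notional minimum growth temperature for L. monocytogenes (assumed), degrees C

def growth_rate(temp_c):
    """Specific growth rate mu (per hour) from the square-root (Ratkowsky-type) model."""
    sqrt_mu = B * (np.asarray(temp_c) - T_MIN)
    return np.clip(sqrt_mu, 0.0, None) ** 2

# Temperature scenario: 48 h at retail (4 C), then 24 h of domestic temperature abuse (10 C)
scenario = [(4.0, 48.0), (10.0, 24.0)]
log10_increase = sum(growth_rate(t) * hours for t, hours in scenario) / np.log(10)
print("predicted increase: %.2f log10 CFU" % log10_increase)
```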
