Related Articles (20 results)
1.
A. Pielaat, Risk Analysis, 2011, 31(9): 1434–1450
A novel use of mathematical models in quantitative microbial risk assessment (QMRA) is to identify the sources of microbial contamination in a food chain (i.e., biotracing). In this article we propose a framework for the construction of a biotracing model, eventually to be used in industrial food production chains where discrete numbers of products are processed that may be contaminated by a multitude of sources. The framework consists of steps in which a Monte Carlo model, simulating sequential events in the chain following a modular process risk modeling (MPRM) approach, is converted to a Bayesian belief network (BBN). The resulting model provides a probabilistic quantification of concentrations of a pathogen throughout a production chain. A BBN allows for updating the parameters of the model based on observational data, and global parameter sensitivity analysis is readily performed in a BBN. Moreover, a BBN enables “backward reasoning” when downstream data are available and is therefore a natural framework for answering biotracing questions. The proposed framework is illustrated with a biotracing model of Salmonella in the pork slaughter chain, based on a recently published Monte Carlo simulation model. This model, implemented as a BBN, describes the dynamics of Salmonella in a Dutch slaughterhouse and enables finding the source of contamination of specific carcasses at the end of the chain.
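To make the "backward reasoning" idea concrete, here is a minimal Monte Carlo sketch, not the authors' BBN: carcasses are contaminated by several hypothetical sources, and conditioning on end-of-chain positives yields an attributable share per source. All source names, probabilities, and cell counts are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # simulated carcasses

# Hypothetical prior probabilities that each source contaminates a carcass.
p_source = {"lairage": 0.10, "scalding": 0.03, "evisceration": 0.07}

# Forward simulation (Monte Carlo, MPRM-style): each source independently
# deposits a random number of cells that partially survive the chain.
counts = {}
total = np.zeros(n)
for src, p in p_source.items():
    contaminated = rng.random(n) < p
    cells = contaminated * rng.poisson(50, n)   # deposited cells
    surviving = rng.binomial(cells, 0.2)        # survival through later steps
    counts[src] = surviving
    total += surviving

# "Backward reasoning": among carcasses found contaminated at the end of the
# chain, what fraction of the recovered cells came from each source?
positive = total > 0
for src in p_source:
    share = counts[src][positive].sum() / total[positive].sum()
    print(f"{src:>12}: attributable share ≈ {share:.2f}")
```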

2.
Intentional or accidental releases of contaminants into a water distribution system (WDS) have the potential to cause significant adverse health effects among individuals consuming water from the system. A flexible analysis framework is presented here for estimating the magnitude of such potential effects and is applied using network models for 12 actual WDSs of varying sizes. Upper bounds are developed for the magnitude of adverse effects of contamination events in WDSs and evaluated using results from the 12 systems. These bounds can be applied in cases in which little system‐specific information is available. The combination of a detailed, network‐specific approach and a bounding approach allows consequence assessments to be performed for systems for which varying amounts of information are available and addresses important needs of individual utilities as well as regional or national assessments. The approach used in the analysis framework allows contaminant injections at any or all network nodes and uses models that (1) account for contaminant transport in the systems, including contaminant decay, and (2) provide estimates of ingested contaminant doses for the exposed population. The approach can be easily modified as better transport or exposure models become available. The methods presented here provide the ability to quantify or bound potential adverse effects of contamination events for a wide variety of possible contaminants and WDSs, including systems without a network model.
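A minimal sketch of the two model ingredients named in the abstract, using hypothetical numbers: first-order contaminant decay along travel time to a node, followed by an ingested-dose estimate for consumers at that node.

```python
import numpy as np

k = 0.5            # decay rate (1/h), hypothetical
c0 = 10.0          # injected concentration (mg/L), hypothetical
travel_h = np.array([0.5, 2.0, 6.0])   # travel times to three nodes (h)

# (1) Transport with first-order decay: concentration arriving at each node.
c_node = c0 * np.exp(-k * travel_h)

# (2) Ingested dose per person: arriving concentration times tap-water
# intake while the contaminated water is present (1 L/person assumed).
intake_L = 1.0
dose_mg = c_node * intake_L
for t, d in zip(travel_h, dose_mg):
    print(f"travel {t:4.1f} h -> dose {d:6.2f} mg/person")
```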

3.
Attributing foodborne illnesses to food sources is essential to conceive, prioritize, and assess the impact of public health policy measures. The Bayesian microbial subtyping attribution model by Hald et al. is one of the most advanced approaches to attributing sporadic cases; in particular, it allows taking into account the level of exposure to the sources and the differences between bacterial types and between sources. This step forward requires introducing type‐ and source‐dependent parameters, and generates overparameterization, which was addressed in Hald's paper by setting some parameters to constant values. We question the impact of the choices made for the parameterization (which parameters are fixed and which values are used) on model robustness and propose an alternative parameterization for the Hald model. We illustrate this analysis with the 2005 French data set of non‐typhi Salmonella. Mullner's modified Hald model and a simple deterministic model were used to compare the results and assess the accuracy of the estimates. Setting the parameters for bacterial types specific to a unique source instead of the most frequent one and using data‐based values instead of arbitrary values enhanced the convergence and adequacy of the estimates and led to attribution estimates consistent with the other models' results. The type and source parameter estimates were also consistent with Mullner's model estimates. The model appeared to be highly sensitive to parameterization. The proposed solution based on specific types and data‐based values improved the robustness of estimates and enabled the use of this highly valuable tool successfully with the French data set.

4.
Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. Exposure variables that may require additional research before adequate data distributions can be developed are also discussed.
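The compounding-conservatism effect described here is easy to reproduce. The sketch below, with five hypothetical lognormal exposure variables, shows that a product of 95th-percentile point estimates lands far out in the tail of the full Monte Carlo risk distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Five hypothetical lognormal exposure variables (concentration, intake,
# frequency, duration, body-weight factor), all normalized to mean ~1.
vars_ = [rng.lognormal(mean=0.0, sigma=0.5, size=n) for _ in range(5)]

# Full Monte Carlo: element-wise product across the five variables.
risk_mc = np.prod(vars_, axis=0)

# Point-estimate practice: multiply each variable's 95th percentile.
point = np.prod([np.quantile(v, 0.95) for v in vars_])

pct = (risk_mc < point).mean() * 100
print(f"product of five 95th-percentile inputs sits at the "
      f"{pct:.2f}th percentile of the Monte Carlo risk distribution")
```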

5.
Public perceptions of both risks and regulatory costs shape rational regulatory choices. Despite decades of risk perception studies, this article is the first on regulatory cost perceptions. A survey of 744 U.S. residents probed: (1) How knowledgeable are laypeople about regulatory costs incurred to reduce risks? (2) Do laypeople see official estimates of cost and benefit (lives saved) as accurate? (3) (How) do preferences for hypothetical regulations change when mean‐preserving spreads of uncertainty replace certain cost or benefit? and (4) (How) do preferences change when unequal interindividual distributions of hypothetical regulatory costs replace equal distributions? Respondents overestimated costs of regulatory compliance, while assuming agencies underestimate costs. Most assumed agency estimates of benefits are accurate; a third believed both cost and benefit estimates are accurate. Cost and benefit estimates presented without uncertainty were slightly preferred to those surrounded by “narrow uncertainty” (a range of costs or lives entirely within a personally‐calibrated zone without clear acceptance or rejection of tradeoffs). Certain estimates were more preferred than “wide uncertainty” (a range of agency estimates extending beyond these personal bounds, thus posing a gamble between favored and unacceptable tradeoffs), particularly for costs as opposed to benefits (but even for costs a quarter of respondents preferred wide uncertainty to certainty). Agency‐acknowledged uncertainty in general elicited mixed judgments of honesty and trustworthiness. People preferred egalitarian distributions of regulatory costs, despite skewed actual cost distributions, and preferred progressive cost distributions (the rich pay a greater than proportional share) to regressive ones. Efficient and socially responsive regulations require disclosure of much more information about regulatory costs and risks.

6.
Quantitative risk assessments for physical, chemical, biological, occupational, or environmental agents rely on scientific studies to support their conclusions. These studies often include relatively few observations, and, as a result, models used to characterize the risk may include large amounts of uncertainty. The motivation, development, and assessment of new methods for risk assessment are facilitated by the availability of a set of experimental studies that span a range of dose‐response patterns observed in practice. We describe construction of such a historical database focusing on quantal data in chemical risk assessment, and we employ this database to develop priors in Bayesian analyses. The database is assembled from a variety of existing toxicological data sources and contains 733 separate quantal dose‐response data sets. As an illustration of the database's use, prior distributions for individual model parameters in Bayesian dose‐response analysis are constructed. Results indicate that including prior information based on curated historical data in quantitative risk assessments may help stabilize eventual point estimates, producing dose‐response functions that are more precisely estimated. These in turn produce potency estimates that share the same benefit. We are confident that quantitative risk analysts will find many other applications and issues to explore using this database.
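One simple way to turn such a historical database into a prior is sketched below with invented historical control-group response rates (stand-ins for values curated from the 733 data sets, not the paper's actual priors): moment-match a Beta prior and update it conjugately with a new study.

```python
import numpy as np

# Hypothetical background (control) response rates from historical studies.
hist_rates = np.array([0.02, 0.05, 0.00, 0.08, 0.03, 0.10, 0.04, 0.06])

# Moment-matched Beta(a, b) prior: a + b = m(1-m)/v - 1.
m, v = hist_rates.mean(), hist_rates.var(ddof=1)
common = m * (1 - m) / v - 1
a, b = m * common, (1 - m) * common

# New study: 3 responders out of 50 control animals (conjugate update).
x, n = 3, 50
a_post, b_post = a + x, b + n - x
print(f"prior Beta({a:.2f}, {b:.2f}) -> posterior mean "
      f"{a_post / (a_post + b_post):.3f} (stabilized by historical data)")
```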

7.
M. C. Kennedy, Risk Analysis, 2011, 31(10): 1597–1609
Two‐dimensional Monte Carlo simulation is frequently used to implement probabilistic risk models, as it allows for uncertainty and variability to be quantified separately. In many cases, we are interested in the proportion of individuals from a variable population exceeding a critical threshold, together with uncertainty about this proportion. In this article we introduce a new method that can accurately estimate these quantities much more efficiently than conventional algorithms. We also show how those model parameters having the greatest impact on the probabilities of rare events can be quickly identified via this method. The algorithm combines elements from well‐established statistical techniques in extreme value theory and Bayesian analysis of computer models. We demonstrate the practical application of these methods with a simple example, in which the true distributions are known exactly, and also with a more realistic model of microbial contamination of milk with seven parameters. For the latter, sensitivity analysis (SA) is shown to identify the two inputs explaining the majority of variation in distribution tail behavior. In the subsequent prediction of probabilities of large contamination events, similar results are obtained using the new approach, which takes 43 seconds, or the conventional simulation, which requires more than 3 days.
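A minimal illustration of the extreme-value ingredient: fit a generalized Pareto distribution to excesses over a high threshold (peaks-over-threshold) and extrapolate rare-event probabilities that naive Monte Carlo estimates poorly. The lognormal stand-in for model output is an assumption, not the milk model.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
x = rng.lognormal(0.0, 1.0, 20_000)     # stand-in for model output

# Peaks-over-threshold: fit a generalized Pareto to excesses above a high
# threshold, then extrapolate exceedance probabilities for rare events.
u = np.quantile(x, 0.95)
excess = x[x > u] - u
c, loc, scale = genpareto.fit(excess, floc=0)

target = 50.0                            # a level rarely reached in the sample
p_tail = (x > u).mean() * genpareto.sf(target - u, c, loc=0, scale=scale)
p_naive = (x > target).mean()
print(f"GPD estimate P(X>{target}) = {p_tail:.2e}  (naive MC: {p_naive:.2e})")
```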

8.
The Monte Carlo (MC) simulation approach is traditionally used in food safety risk assessment to study quantitative microbial risk assessment (QMRA) models. When experimental data are available, performing Bayesian inference is a good alternative approach that allows backward calculation in a stochastic QMRA model to update the experts’ knowledge about the microbial dynamics of a given food‐borne pathogen. In this article, we propose a complex example where Bayesian inference is applied to a high‐dimensional second‐order QMRA model. The case study is a farm‐to‐fork QMRA model considering genetic diversity of Bacillus cereus in a cooked, pasteurized, and chilled courgette purée. Experimental data are Bacillus cereus concentrations measured in packages of courgette purées stored at different time‐temperature profiles after pasteurization. To perform a Bayesian inference, we first built an augmented Bayesian network by linking a second‐order QMRA model to the available contamination data. We then ran a Markov chain Monte Carlo (MCMC) algorithm to update all the unknown concentrations and unknown quantities of the augmented model. About 25% of the prior beliefs are strongly updated, leading to a reduction in uncertainty. Some updates interestingly question the QMRA model.
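The updating step can be sketched with a toy one-parameter model: a linear log10 growth rate, a prior standing in for expert knowledge, and a Metropolis sampler conditioning on observed concentrations. All data and hyperparameters below are hypothetical, not the B. cereus model.

```python
import numpy as np

rng = np.random.default_rng(3)

t_days = np.array([3.0, 5.0, 7.0, 10.0])        # storage times
obs_log10 = np.array([2.1, 3.0, 3.8, 5.2])      # measured log10 cfu/g
n0, sigma = 0.5, 0.4                            # initial level, meas. error

def log_post(mu):
    """Log posterior: prior N(0.3, 0.2) on growth rate x Gaussian likelihood."""
    if mu <= 0:
        return -np.inf
    pred = n0 + mu * t_days                     # toy growth model
    return (-0.5 * ((mu - 0.3) / 0.2) ** 2
            - 0.5 * np.sum(((obs_log10 - pred) / sigma) ** 2))

# Metropolis sampling: random-walk proposals, accept with posterior ratio.
mu, chain = 0.3, []
for _ in range(20_000):
    prop = mu + rng.normal(0, 0.05)
    if np.log(rng.random()) < log_post(prop) - log_post(mu):
        mu = prop
    chain.append(mu)

post = np.array(chain[5_000:])                  # drop burn-in
print(f"growth rate: prior mean 0.30 -> posterior "
      f"{post.mean():.3f} ± {post.std():.3f} log10/day")
```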

9.
Potential climate‐change‐related impacts to agriculture in the upper Midwest pose serious economic and ecological risks to the U.S. and the global economy. On a local level, farmers are at the forefront of responding to the impacts of climate change. Hence, it is important to understand how farmers and their farm operations may be more or less vulnerable to changes in the climate. A vulnerability index is a tool commonly used by researchers and practitioners to represent the geographical distribution of vulnerability in response to global change. Most vulnerability assessments measure objective adaptive capacity using secondary data collected by governmental agencies. However, other scholarship on human behavior has noted that sociocultural and cognitive factors, such as risk perceptions and perceived capacity, are consequential for modulating people's actual vulnerability. Thus, traditional assessments can potentially overlook people's subjective perceptions of changes in climate and extreme weather events and the extent to which people feel prepared to take necessary steps to cope with and respond to the negative effects of climate change. This article addresses this knowledge gap by: (1) incorporating perceived adaptive capacity into a vulnerability assessment; (2) using spatial smoothing to aggregate individual‐level vulnerabilities to the county level; and (3) evaluating the relationships among different dimensions of adaptive capacity to examine whether perceived capacity should be integrated into vulnerability assessments. The results suggest that vulnerability assessments that rely only on objective measures might miss important sociocognitive dimensions of capacity. Vulnerability indices and maps presented in this article can inform engagement strategies for improving environmental sustainability in the region.
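A minimal sketch of the index construction: blend objective and perceived adaptive capacity, aggregate farm-level vulnerability to counties, and smooth across neighboring counties. The equal weights, invented data, and toy 1-D county chain (standing in for real county adjacency) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

n_farms, n_counties = 500, 10
county = rng.integers(0, n_counties, n_farms)
objective = rng.random(n_farms)        # e.g., income, irrigation (normalized)
perceived = rng.random(n_farms)        # e.g., self-rated preparedness
exposure = rng.random(n_farms)

# Capacity blends objective and perceived dimensions; vulnerability falls
# as capacity rises.
capacity = 0.5 * objective + 0.5 * perceived
vulnerability = exposure * (1 - capacity)

# Aggregate farm-level vulnerability to the county level.
county_v = np.array([vulnerability[county == c].mean()
                     for c in range(n_counties)])

# Spatial smoothing on a toy 1-D chain of counties (neighbor averaging).
smoothed = np.convolve(np.pad(county_v, 1, mode="edge"),
                       np.ones(3) / 3, mode="valid")
print(np.round(smoothed, 3))
```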

10.
Root cause analysis can be used in foodborne illness outbreak investigations to determine the underlying causes of an outbreak and to help identify actions that could be taken to prevent future outbreaks. We developed a new tool, the Quantitative Risk Assessment-Epidemic Curve Prediction Model (QRA-EC), to assist with these goals and applied it to a case study to investigate and illustrate the utility of leveraging quantitative risk assessment to provide unique insights for foodborne illness outbreak root cause analysis. We used a 2019 Salmonella outbreak linked to melons as a case study to demonstrate the utility of this model (Centers for Disease Control and Prevention [CDC], 2019). The model was used to evaluate the impact of various root cause hypotheses (representing different contamination sources and food safety system failures in the melon supply chain) on the predicted number and timeline of illnesses. The predicted number of illnesses varied by contamination source and was strongly impacted by the prevalence and level of Salmonella contamination on the surface/inside of whole melons and inside contamination niches on equipment surfaces. The timeline of illnesses was most strongly impacted by equipment sanitation efficacy for contamination niches. Evaluations of a wide range of scenarios representing various potential root causes enabled us to identify which hypotheses were likely to result in an outbreak of similar size and illness timeline to the 2019 Salmonella melon outbreak. The QRA-EC framework can be adapted to accommodate any food–pathogen pair to provide insights for foodborne outbreak investigations.
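The epidemic-curve mechanics can be sketched as a convolution: daily illness onsets arising from contaminated servings, shifted by an incubation-plus-reporting delay distribution. The prevalence, dose-response probability, market window, and delay values below are invented, not the QRA-EC parameterization.

```python
import numpy as np

rng = np.random.default_rng(5)

days = 60
servings = np.where(np.arange(days) < 30, 2_000, 0)   # product on market 30 d
p_contam, p_ill = 0.01, 0.15                          # prevalence, P(ill|dose)

# Illness onsets per day from contaminated servings.
new_ill = rng.binomial(servings, p_contam * p_ill)

# Incubation + reporting delay distribution over days 0-7 (sums to 1).
delay = np.array([0.0, 0.05, 0.15, 0.25, 0.25, 0.15, 0.10, 0.05])
reported = np.convolve(new_ill, delay)[:days]
print(f"peak reported cases/day: {reported.max():.1f}")
```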

11.
Ali Mosleh, Risk Analysis, 2012, 32(11): 1888–1900
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. This potential exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk; rating agencies, whose assessments date back to the 19th century, provide their estimates of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from “nominal predictions” due to “upsetting events” such as the 2008 global banking crisis.
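The combination idea reduces to Bayes' rule over agency calls weighted by past performance. The sketch below uses two hypothetical agencies with invented hit and false-alarm rates; it is an illustration of the weighting principle, not the paper's full framework.

```python
# Combine two rating agencies' "default next year?" calls using their
# historical accuracy, via Bayes' rule. All numbers are hypothetical.
prior_default = 0.02

# P(agency says "default" | true state), from past performance data:
# hit rate given a true default, false-alarm rate given no default.
acc = {
    "A": {"default": 0.80, "no_default": 0.10},
    "B": {"default": 0.60, "no_default": 0.05},
}
calls = {"A": True, "B": False}                    # this year's calls

like_d, like_n = prior_default, 1 - prior_default
for agency, says_default in calls.items():
    pd_, pn = acc[agency]["default"], acc[agency]["no_default"]
    like_d *= pd_ if says_default else 1 - pd_
    like_n *= pn if says_default else 1 - pn

posterior = like_d / (like_d + like_n)
print(f"posterior default probability: {posterior:.3f}")
```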

12.
This study develops a comprehensive framework to optimize new product introduction timing and subsequent production decisions faced by a component supplier. Prior to market entry, the supplier performs process design activities, which improve manufacturing yield and the chances of getting qualified for the customer's product. However, a long delay in market entry allows competitors to enter the market and pass the customer's qualification process before the supplier, reducing the supplier's share of the customer's business. After entering the market and if qualified, the supplier also needs to decide how much to produce for a finite planning horizon by considering several factors such as manufacturing yield and stochastic demand, both of which depend on the earlier time‐to‐market decision. To capture this dependency, we develop a sequential, nested, two‐stage decision framework to optimize the time‐to‐market and production decisions in relation to each other. We show that the supplier's optimal market entry and qualification timing decisions need to be revised in real time based on the number of qualified competitors at the time of the market‐entry decision. We establish the optimality of a threshold policy. Following this policy, at the beginning of each decision epoch, the supplier should optimally stop preparing for qualification and decide whether to enter the market if her order among qualified competitors exceeds a predetermined threshold. We also prove that the supplier's optimal production policy is a state‐dependent, base‐stock policy, which depends on the time‐to‐market and qualification decisions. The proposed framework also enables a firm to quantify how market conditions (such as price and competitor entry behavior) and operating conditions (such as the rate of learning and inventory/production‐related costs) affect time‐to‐market strategy and post‐entry production decisions.
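A base-stock policy is easy to simulate. The sketch below uses a fixed, hypothetical base-stock level and random yield, whereas the paper's optimal level is state dependent; it only illustrates the produce-up-to mechanic.

```python
import numpy as np

rng = np.random.default_rng(6)

S, periods = 120, 12                   # hypothetical base-stock level, horizon
inventory, served, demand_total = 0.0, 0.0, 0.0

for _ in range(periods):
    # Produce up to S; realized output is random due to yield (80% mean).
    produced = rng.binomial(max(S - int(inventory), 0), 0.8)
    inventory += produced
    d = rng.poisson(90)                # stochastic demand
    sold = min(inventory, d)
    inventory -= sold
    served += sold
    demand_total += d

print(f"fill rate over horizon: {served / demand_total:.2%}")
```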

13.
Richard Genovesi, Risk Analysis, 2012, 32(12): 2182–2197
Drinking water supplies are at risk of contamination from a variety of physical, chemical, and biological sources. Ranked among these threats are hazardous material releases from leaking or improperly managed underground storage tanks located at municipal, commercial, and industrial facilities. To reduce human health and environmental risks associated with the subsurface storage of hazardous materials, government agencies have taken a variety of legislative and regulatory actions—which date back more than 25 years and include the establishment of rigorous equipment/technology/operational requirements and facility‐by‐facility inspection and enforcement programs. Given a history of more than 470,000 underground storage tank releases nationwide, the U.S. Environmental Protection Agency continues to report that 7,300 new leaks were found in federal fiscal year 2008, while nearly 103,000 old leaks remain to be cleaned up. In this article, we report on an alternate evidence‐based intervention approach for reducing potential releases from the storage of petroleum products (gasoline, diesel, kerosene, heating/fuel oil, and waste oil) in underground tanks at commercial facilities located in Rhode Island. The objective of this study was to evaluate whether a new regulatory model can be used as a cost‐effective alternative to traditional facility‐by‐facility inspection and enforcement programs for underground storage tanks. We conclude that the alternative model, using an emphasis on technical assistance tools, can produce measurable improvements in compliance performance, is a cost‐effective adjunct to traditional facility‐by‐facility inspection and enforcement programs, and has the potential to allow regulatory agencies to decrease their frequency of inspections among low risk facilities without sacrificing compliance performance or increasing public health risks.

14.
Listeria monocytogenes is among the foodborne pathogens with the highest death toll in the United States. Ready‐to‐eat foods contaminated at retail are an important source of infection. Environmental sites in retail deli operations can be contaminated. However, commonly contaminated sites are unlikely to come into direct contact with food and the public health relevance of environmental contamination has remained unclear. To identify environmental sites that may pose a considerable cross‐contamination risk, to elucidate potential transmission pathways, and to identify knowledge gaps, we performed a structured expert elicitation of 41 experts from state regulatory agencies and the food retail industry with practical experience in retail deli operations. Following the “Delphi” method, the elicitation was performed in three consecutive steps: questionnaire, review and discussion of results, second questionnaire. Hands and gloves were identified as important potential contamination sources. However, bacterial transfers to and from hands or gloves represented a major data gap. Experts agreed about transfer probabilities from cutting boards, scales, deli cases, and deli preparation sinks to product, and about transfer probabilities from floor drains, walk‐in cooler floors, and knife racks to food contact surfaces. Comparison of experts’ opinions to observational data revealed a tendency among experts with certain demographic characteristics and professional opinions to overestimate prevalence. Experts’ votes clearly clustered into separate groups not defined by place of employment, even though industry experts may have been somewhat overrepresented in one cluster. Overall, our study demonstrates the value and caveats of expert elicitation to identify data gaps and prioritize research efforts.

15.
We used an agent‐based modeling (ABM) framework and developed a mathematical model to explain the complex dynamics of microbial persistence and spread within a food facility and to aid risk managers in identifying effective mitigation options. The model explicitly considered personal hygiene practices by food handlers as well as their activities and simulated a spatially explicit dynamic system representing complex interaction patterns among food handlers, facility environment, and foods. To demonstrate the utility of the model in a decision‐making context, we created a hypothetical case study and used it to compare different risk mitigation strategies for reducing contamination and spread of Listeria monocytogenes in a food facility. Model results indicated that areas with no direct contact with foods (e.g., loading dock and restroom) can serve as contamination niches and recontaminate areas that have direct contact with food products. Furthermore, food handlers’ behaviors, including, for example, hygiene and sanitation practices, can impact the persistence of microbial contamination in the facility environment and the spread of contamination to prepared foods. Using this case study, we also demonstrated benefits of an ABM framework for addressing food safety in a complex system in which emergent system‐level responses are predicted using a bottom‐up approach that observes individual agents (e.g., food handlers) and their behaviors. Our model can be applied to a wide variety of pathogens, food commodities, and activity patterns to evaluate efficacy of food‐safety management practices and quantify contamination reductions associated with proposed mitigation strategies in food facilities.
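A stripped-down version of such an ABM: handlers move among zones, contamination transfers on contact, and occasional handwashing resets hands. It reproduces the qualitative finding that non-food-contact areas can seed food-contact surfaces. The zone names, transfer rate, and probabilities are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)

zones = ["loading_dock", "restroom", "prep_table", "slicer"]
level = {"loading_dock": 1.0, "restroom": 1.0, "prep_table": 0.0, "slicer": 0.0}
hands = np.zeros(3)                       # contamination on 3 handlers

for step in range(200):
    for h in range(3):
        z = zones[rng.integers(len(zones))]          # handler visits a zone
        transfer = 0.1 * (level[z] - hands[h])       # exchange toward balance
        hands[h] += transfer
        level[z] -= transfer
        if rng.random() < 0.05:                      # occasional handwashing
            hands[h] = 0.0

# Food-contact zones (prep_table, slicer) end up contaminated via hands.
print({z: round(v, 3) for z, v in level.items()})
print("hands:", np.round(hands, 3))
```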

16.
Food safety objectives (FSOs) are established in order to minimize the risk of foodborne illnesses to consumers, but these have not yet been incorporated into regulatory policy. An FSO states the maximum frequency and/or concentration of a microbiological hazard in a food at the time of consumption that provides an acceptable level of protection to the public and leads to a performance criterion for industry. However, in order to be implemented as a regulation, this criterion has to be achievable by the affected industry. In order to determine an FSO, the steps to produce and store that food need to be known, especially where they have an impact on contamination, growth, and destruction. This article uses existing models for growth of Listeria monocytogenes in conjunction with calculations of FSOs to approximate the outcome of more than one introduction of the foodborne organism throughout the food-processing path from the farm to the consumer. Most models for the growth and reduction of foodborne pathogens are logarithmic in nature, which fits the growth of microorganisms, spanning many orders of magnitude. However, these logarithmic models are normally limited to a single introduction step and a single reduction step. The model presented as part of this research addresses more than one introduction of food contamination, each of which can be separated by a substantial amount of time. The advantage of treating the problem this way is the accommodation of multiple introductions of foodborne pathogens over a range of time durations and conditions.
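The accounting behind an FSO is the standard inequality H0 + ΣI - ΣR ≤ FSO (initial level plus summed increases minus summed reductions, all in log10). The twist with multiple introductions is that a second introduction adds on the arithmetic scale, not the log scale, so log levels must be combined as log10(10^a + 10^b). A sketch with hypothetical numbers:

```python
import numpy as np

def add_log10(a, b):
    """Combine two contamination levels expressed in log10 cfu/g.

    Introductions add arithmetically, so log levels cannot simply be
    summed: returns log10(10**a + 10**b), computed stably.
    """
    hi, lo = max(a, b), min(a, b)
    return hi + np.log10(1 + 10 ** (lo - hi))

# Hypothetical farm-to-fork path: initial level, growth, a kill step, then
# a second (recontamination) introduction and more growth before consumption.
level = -2.0                     # H0: initial contamination, log10 cfu/g
level += 3.0                     # growth during storage (part of ΣI)
level -= 6.0                     # listericidal treatment (part of ΣR)
level = add_log10(level, -3.5)   # second introduction after the kill step
level += 2.0                     # growth during retail/home storage

fso = -1.0                       # hypothetical FSO at consumption, log10 cfu/g
print(f"level at consumption: {level:.2f} log10 cfu/g "
      f"({'meets' if level <= fso else 'violates'} FSO of {fso})")
```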

17.
Bob Maaskant, Risk Analysis, 2011, 31(2): 282–300
The Dutch government is in the process of revising its flood safety policy. The current safety standards for flood defenses in the Netherlands are largely based on the outcomes of cost‐benefit analyses. Loss of life has not been considered separately in the choice for current standards. This article presents the results of a research project that evaluated the potential roles of two risk metrics, individual and societal risk, to support decision making about new flood safety standards. These risk metrics are already used in the Dutch major hazards policy for the evaluation of risks to the public. Individual risk concerns the annual probability of death of a person. Societal risk concerns the probability of an event with many fatalities. Technical aspects of the use of individual and societal risk metrics in flood risk assessments as well as policy implications are discussed. Preliminary estimates of nationwide levels of societal risk are presented. Societal risk levels appear relatively high in the southwestern part of the country where densely populated dike rings are threatened by a combination of river and coastal floods. It was found that cumulation, the simultaneous flooding of multiple dike rings during a single flood event, has significant impact on the national level of societal risk. Options for the application of the individual and societal risk in the new flood safety policy are presented and discussed.
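Both metrics are straightforward to compute from a scenario set. The sketch below builds an FN-style exceedance curve (P(N ≥ n) fatalities per year) from three hypothetical breach scenarios, including a cumulation scenario whose effect on the societal-risk tail is directly visible. All probabilities and fatality counts are invented.

```python
# Societal risk from a set of hypothetical flood scenarios.
scenarios = [
    {"p": 1e-4, "fatalities": 50},       # single dike ring
    {"p": 5e-5, "fatalities": 400},      # single dike ring
    {"p": 1e-5, "fatalities": 3000},     # cumulation: multiple rings at once
]

expected = sum(s["p"] * s["fatalities"] for s in scenarios)
print(f"expected fatalities/year: {expected:.3f}")

# FN curve: annual probability of an event with at least n fatalities.
for n in (10, 100, 1000):
    p_exceed = sum(s["p"] for s in scenarios if s["fatalities"] >= n)
    print(f"P(N >= {n:>4}) = {p_exceed:.1e}")
```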

18.
Driven by expanding consumption, food quality and safety problems are becoming increasingly severe, and the deliberate, covert, and technically sophisticated nature of economically motivated adulteration by food enterprises poses a series of challenges for food safety governance in the new era. In view of this, this article draws on evolutionary game theory and considers how factors such as enterprise revenue, food production technology costs, losses from producing defective food, government regulatory costs, and negative social effects shape the formation of food safety risk; it constructs an evolutionary game model of food adulteration behavior and analyzes its evolutionary states both theoretically and through simulation. Building on this, cellular automata theory is applied, further taking into account the strategy-switching willingness of food enterprises and of grassroots food regulatory agencies, to dissect and characterize the spatial evolution of food adulteration behavior and its regulation from a spatial-game perspective. The results show that food enterprises and grassroots food regulatory agencies exhibit synchronized oscillation in their strategy choices. Moreover, when the strategy-switching willingness of either food enterprises or grassroots regulators remains low, the strategy profile (strict regulation, compliant production) is a pure-strategy stable state. This research enriches China's food safety regulation theory and offers ideas and theoretical guidance for long-term local food safety regulation.
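The evolutionary-game core can be sketched with replicator dynamics for the compliant-production share x and the strict-regulation share y. The payoff parameters below are hypothetical stand-ins for the paper's revenue, cost, and fine terms; the trajectory cycles, matching the synchronized oscillation reported.

```python
import numpy as np

g, c, f = 5.0, 2.0, 8.0        # adulteration gain, inspection cost, fine
x, y, dt = 0.5, 0.5, 0.005     # initial shares, Euler step
xs, ys = [], []

for _ in range(20_000):
    # Enterprise payoffs: comply vs adulterate (fined if inspected).
    u_comply, u_adulterate = 0.0, g - y * f
    # Regulator payoffs: strict (costly, collects fines) vs lax.
    u_strict, u_lax = -c + (1 - x) * f, 0.0

    # Replicator dynamics: shares grow with their payoff advantage.
    x += dt * x * (1 - x) * (u_comply - u_adulterate)
    y += dt * y * (1 - y) * (u_strict - u_lax)
    xs.append(x)
    ys.append(y)

tail_x, tail_y = np.array(xs[-5000:]), np.array(ys[-5000:])
print(f"x oscillates in [{tail_x.min():.2f}, {tail_x.max():.2f}], "
      f"y in [{tail_y.min():.2f}, {tail_y.max():.2f}]  (synchronized cycles)")
```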

19.
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, the use of quantitative microbiological risk assessment is an appealing approach to link new risk‐based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to practically show how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event of the milk, the fresh cheese, or the process environment is simulated over time, space, and between products, accounting for the impact of management options, such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What‐if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.
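The sensitivity-analysis step can be sketched with a toy risk chain (contamination, growth, exposure, dose-response) and one-at-a-time perturbations to see which parameter drives the listeriosis risk estimate. The parameter values and the linear dose-response are hypothetical, not the paper's model.

```python
# Toy risk chain with one-at-a-time sensitivity; all values hypothetical.
base = {"log10_c0": -3.0,      # initial level in cheese, log10 cfu/g
        "growth": 2.0,         # log10 growth during the cold chain
        "serving_g": 25.0,     # serving size, g
        "log10_r": -11.0}      # dose-response parameter (P(ill) = r * dose)

def risk(p):
    dose = 10 ** (p["log10_c0"] + p["growth"]) * p["serving_g"]
    return min(10 ** p["log10_r"] * dose, 1.0)

r0 = risk(base)
for name in base:
    hi = dict(base, **{name: base[name] * 1.1})    # x1.1 on the parameter
    print(f"{name:>10}: risk x {risk(hi) / r0:.2f} for a 10% change")
```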

20.
In Science and Decisions: Advancing Risk Assessment, the National Research Council recommends improvements in the U.S. Environmental Protection Agency's approach to risk assessment. The recommendations aim to increase the utility of these assessments, embedding them within a new risk‐based decision‐making framework. The framework involves first identifying the problem and possible options for addressing it, conducting related analyses, then reviewing the results and making the risk management decision. Experience with longstanding requirements for regulatory impact analysis provides insights into the implementation of this framework. First, neither the Science and Decisions framework nor the framework for regulatory impact analysis should be viewed as a static or linear process, where each step is completed before moving on to the next. Risk management options are best evaluated through an iterative and integrative procedure. The extent to which a hazard has been previously studied will strongly influence analysts' ability to identify options prior to conducting formal analyses, and these options will be altered and refined as the analysis progresses. Second, experience with regulatory impact analysis suggests that legal and political constraints may limit the range of options assessed, contrary to both existing guidance for regulatory impact analysis and the Science and Decisions recommendations. Analysts will need to work creatively to broaden the range of options considered. Finally, the usefulness of regulatory impact analysis has been significantly hampered by the inability to quantify many health impacts of concern, suggesting that the scientific improvements offered within Science and Decisions will fill a crucial research gap.
