Similar Literature
12 similar documents found
1.
A Poultry-Processing Model for Quantitative Microbiological Risk Assessment   (Cited by 3: 0 self-citations, 3 by others)
A poultry-processing model for a quantitative microbiological risk assessment (QMRA) of campylobacter is presented, which can also be applied to other QMRAs involving poultry processing. The same basic model is applied in each consecutive stage of industrial processing. It describes the effects of inactivation and removal of the bacteria, and the dynamics of cross-contamination in terms of the transfer of campylobacter from the intestines to the carcass surface and the environment, from the carcasses to the environment, and from the environment to the carcasses. From the model it can be derived that, in general, the effect of inactivation and removal is dominant for those carcasses with high initial bacterial loads, and cross-contamination is dominant for those with low initial levels. In other QMRA poultry-processing models, the input-output relationship between the numbers of bacteria on the carcasses is usually assumed to be linear on a logarithmic scale. By including some basic mechanistics, it is shown that this may not be realistic. As nonlinear behavior may affect the predicted effects of risk mitigations, this finding is relevant for risk management. Good knowledge of the variability of bacterial loads on poultry entering the process is important. The common practice in microbiology of presenting only the geometric mean of bacterial counts is insufficient: arithmetic means are more suitable, in particular, to describe the effect of cross-contamination. The effects of logistic slaughter (scheduled processing) as a risk mitigation strategy are predicted to be small. Some additional complications in applying microbiological data obtained in processing plants are discussed.
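To make the stage-by-stage mechanics concrete, here is a minimal sketch of one processing stage with inactivation/removal and carcass-environment transfer, followed by the arithmetic-versus-geometric-mean comparison. This is not the published model; the transfer fractions, inactivation factor, and incoming-load distribution are all hypothetical.

```python
# Minimal sketch (not the authors' published model): one processing stage with
# inactivation/removal and cross-contamination via a shared environment.
# All transfer fractions and the inactivation factor are hypothetical values.

import numpy as np

rng = np.random.default_rng(0)

def process_stage(n_carcass, n_env, p_c2e=0.10, p_e2c=0.05, inact=0.90):
    """Update bacterial loads on a batch of carcasses and in the environment.

    n_carcass : array of CFU per carcass entering the stage
    n_env     : CFU currently residing in the stage environment
    p_c2e     : fraction shed from each carcass to the environment
    p_e2c     : fraction of the environmental pool deposited onto the batch
    inact     : fraction of bacteria inactivated/removed in the stage
    """
    shed = p_c2e * n_carcass                          # carcass -> environment
    n_carcass = n_carcass - shed
    n_env = n_env + shed.sum()
    picked_up = p_e2c * n_env * np.ones_like(n_carcass) / len(n_carcass)
    n_carcass = (1 - inact) * (n_carcass + picked_up)  # environment -> carcass
    n_env = (1 - inact) * (n_env - picked_up.sum())
    return n_carcass, n_env

# Highly variable incoming loads: lognormal across carcasses (hypothetical).
incoming = rng.lognormal(mean=6, sigma=2, size=1000)   # CFU per carcass
out, env = process_stage(incoming, n_env=0.0)

# Why the arithmetic mean matters for cross-contamination: it tracks the total
# number of organisms available for transfer, which the geometric mean hides.
print("arithmetic mean:", out.mean())
print("geometric mean :", np.exp(np.log(out).mean()))
```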

2.
Cheese smearing is a complex process and the potential for cross-contamination with pathogenic or undesirable microorganisms is critical. During ripening, cheeses are salted and washed with brine to develop flavor and remove molds that could develop on the surfaces. Considering the potential for cross-contamination of this process in quantitative risk assessments could contribute to a better understanding of this phenomenon and, eventually, improve its control. The purpose of this article is to model the cross-contamination of smear-ripened cheeses due to the smearing operation under industrial conditions. A compartmental, dynamic, and stochastic model is proposed for mechanical brush smearing. This model has been developed to describe the exchange of microorganisms between compartments. Based on the analytical solution of the model equations and on experimental data collected with an industrial smearing machine, we assessed the values of the transfer parameters of the model. Monte Carlo simulations, using the distributions of transfer parameters, provide the final number of contaminated products in a batch and their final level of contamination for a given scenario taking into account the initial number of contaminated cheeses of the batch and their contaminant load. Based on analytical results, the model provides indicators for smearing efficiency and propensity of the process for cross-contamination. Unlike traditional approaches in mechanistic models, our approach captures the variability and uncertainty inherent in the process and the experimental data. More generally, this model could represent a generic base to use in modeling similar processes prone to cross-contamination.
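The following is a compartmental Monte Carlo sketch of the general idea: a single brush compartment exchanges cells with cheeses smeared in sequence, and uncertain transfer parameters are redrawn for each run. This is not the published model; the Beta transfer-parameter distributions, batch size, and initial load are hypothetical placeholders.

```python
# Minimal sketch, not the published smearing model: a brush compartment
# exchanging contamination with cheeses processed in sequence.

import numpy as np

rng = np.random.default_rng(1)

def smear_batch(initial_loads, n_runs=2000):
    """Monte Carlo over a batch: initial_loads[i] = CFU on cheese i before smearing."""
    n = len(initial_loads)
    final_counts = np.zeros((n_runs, n))
    for r in range(n_runs):
        # Draw uncertain transfer parameters for this run (hypothetical priors).
        a_cheese_to_brush = rng.beta(2, 20)   # fraction moved cheese -> brush
        a_brush_to_cheese = rng.beta(2, 20)   # fraction moved brush -> cheese
        brush = 0.0
        loads = initial_loads.astype(float)
        for i in range(n):                    # cheeses smeared one after another
            to_brush = a_cheese_to_brush * loads[i]
            from_brush = a_brush_to_cheese * brush
            loads[i] += from_brush - to_brush
            brush += to_brush - from_brush
        final_counts[r] = loads
    return final_counts

# One contaminated cheese at the head of a 50-cheese batch (hypothetical).
init = np.zeros(50); init[0] = 1e4
res = smear_batch(init)
print("mean number of contaminated cheeses per run:",
      (res > 1).sum(axis=1).mean())
```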

3.
A Multi-View Modeling Method for Requirements Engineering of Military Command Information Systems   (Cited by 1: 0 self-citations, 1 by others)
Fang Cheng. Chinese Journal of Management (《管理学报》), 2007, 4(4): 409-413
A complete and accurate description of the essence of a military command information system requires building its requirements model from multiple perspectives. By examining requirements analysis activities through a multi-view framework based on the core ideas of the Zachman framework, requirement models are constructed from several different aspects, forming a modeling framework for analyzing the requirements of military command information systems. The framework offers four main advantages: multiple viewpoints, flexibility in responding to change, independence from specific technologies, and effective integration with existing modeling methods.

4.
Modeling the Role of the Product Platform in Production Systems Based on the Mass Customization Paradigm   (Cited by 4: 0 self-citations, 4 by others)
Centered on the operational strategy of delivering customized products or services on a large scale quickly, flexibly, and reliably, with consistent quality and at reasonable cost, this paper outlines the operational characteristics of the three main types of production systems for mass customization (customize-to-deliver, customize-to-assemble, and customize-to-manufacture), analyzes the architecture of the product platform in detail, proposes for the first time an integrated strategy for implementing mass customization based on the product platform, and models the role of the product platform in mass-customization-oriented production systems. This work has been validated in enterprise application practice.

5.
Coastal cities around the world have experienced large costs from major flooding events in recent years. Climate change is predicted to bring an increased likelihood of flooding due to sea level rise and more frequent severe storms. In order to plan future development and adaptation, cities must know the magnitude of losses associated with these events, and how they can be reduced. Often losses are calculated from insurance claims or surveying flood victims. However, this largely neglects the loss due to the disruption of economic activity. We use a forward-looking dynamic computable general equilibrium model to study how a local economy responds to a flood, focusing on the subsequent recovery/reconstruction. Initial damage is modeled as a shock to the capital stock and recovery requires rebuilding that stock. We apply the model to Vancouver, British Columbia by considering a flood scenario causing total capital damage of $14.6 billion spread across five municipalities. GDP loss relative to a no-flood scenario is relatively long-lasting. It is 2.0% ($2.2 billion) in the first year after the flood, 1.7% ($1.9 billion) in the second year, and 1.2% ($1.4 billion) in the fifth year.
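The persistence of the GDP loss comes from the time it takes to rebuild the destroyed capital stock. The toy capital-accumulation path below illustrates that mechanism only; it is not a CGE model, and every parameter except the $14.6 billion damage figure from the scenario is hypothetical.

```python
# Not a CGE model -- just a toy capital-recovery path showing why GDP losses
# persist after the initial shock. All parameter values are hypothetical.

capital0 = 300.0        # pre-flood capital stock, $ billions (hypothetical)
damage = 14.6           # destroyed capital, $ billions (from the scenario)
alpha = 0.35            # capital share in a Cobb-Douglas production function
delta = 0.05            # depreciation rate
invest_rate = 0.22      # share of output reinvested

def output(k):
    return k ** alpha   # labour normalised to 1

baseline_k = capital0
shocked_k = capital0 - damage
for year in range(1, 6):
    # Both paths accumulate capital; the shocked path starts lower.
    baseline_k += invest_rate * output(baseline_k) - delta * baseline_k
    shocked_k += invest_rate * output(shocked_k) - delta * shocked_k
    gap = 100 * (1 - output(shocked_k) / output(baseline_k))
    print(f"year {year}: GDP gap ~ {gap:.2f}%")
```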

6.
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.
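A minimal sketch of the hierarchical idea, not the authors' model: each data source has its own accident rate drawn from a common Gamma population distribution, and the conjugate Poisson-Gamma update gives a source-specific posterior that can feed forward predictions. The counts, exposures, and hyperparameters below are invented.

```python
# Two-level Poisson-Gamma sketch: source-specific rates under a shared prior.
import numpy as np

rng = np.random.default_rng(2)

# Observed (event count, exposure-years) pairs from several sources -- hypothetical.
data = [(0, 120.0), (2, 300.0), (1, 80.0), (5, 900.0)]

# Population-level Gamma(a, b) prior on the per-source rate; weakly informative.
a, b = 0.5, 100.0

# Conjugate update: posterior for source i is Gamma(a + k_i, b + T_i).
posterior_draws = [rng.gamma(a + k, 1.0 / (b + T), size=20_000) for k, T in data]

rates = np.vstack(posterior_draws)
print("posterior mean rate per source:", rates.mean(axis=1))

# Forward analysis: predictive probability of >= 1 event in 50 exposure-years.
print("P(at least one event in 50 yr):",
      (1 - np.exp(-rates * 50.0)).mean(axis=1))
```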

7.
In this paper we compare expectations derived from 10 different human physiologically based pharmacokinetic models for perchloroethylene with data on absorption via inhalation, and concentrations in alveolar air and venous blood. Our most interesting finding is that essentially all of the models show a time pattern of departures of predictions of air and blood levels relative to experimental data that might be corrected by more sophisticated model structures incorporating either (a) heterogeneity of the fat compartment (with respect to either perfusion or partition coefficients or both) or (b) intertissue diffusion of perchloroethylene between the fat and muscle/VRG groups. Similar types of corrections have recently been proposed to reduce analogous anomalies in the fits of pharmacokinetic models to the data for several volatile anesthetics.(17-20) A second finding is that models incorporating resting values for alveolar ventilation in the region of 5.4 L/min seemed to be most compatible with the most reliable set of perchloroethylene uptake data.
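For orientation, a generic flow-limited PBPK skeleton of the kind being compared is sketched below, with a fat compartment, a lumped muscle/richly perfused compartment, and saturable hepatic metabolism. The compartment set, partition coefficients, and metabolic constants are illustrative only, not the calibrated perchloroethylene values from any of the ten reviewed models.

```python
# Skeletal flow-limited PBPK sketch for an inhaled solvent (illustrative values).
Q_alv, Q_card = 5.4, 5.8                        # alveolar ventilation, cardiac output (L/min)
Q = {"fat": 0.3, "rest": 4.5, "liv": 1.0}       # tissue blood flows (L/min)
V = {"fat": 12.0, "rest": 45.0, "liv": 1.7}     # tissue volumes (L)
P = {"fat": 100.0, "rest": 3.0, "liv": 5.0}     # tissue:blood partition coefficients
P_blood_air = 12.0
C_inh = 0.05                                    # inhaled concentration (mg/L)
Vmax, Km = 0.2, 1.0                             # saturable hepatic metabolism (mg/min, mg/L)

C = {k: 0.0 for k in V}                         # tissue concentrations (mg/L)
dt, t_end = 0.1, 240.0                          # Euler step and duration (min)
for step in range(int(t_end / dt)):
    C_ven = sum(Q[k] * C[k] / P[k] for k in V) / Q_card
    # Steady-state gas exchange gives the arterial concentration:
    C_art = (Q_alv * C_inh + Q_card * C_ven) / (Q_card + Q_alv / P_blood_air)
    for k in V:
        dC = Q[k] * (C_art - C[k] / P[k]) / V[k]
        if k == "liv":                          # Michaelis-Menten metabolism
            dC -= Vmax * (C[k] / P[k]) / (Km + C[k] / P[k]) / V[k]
        C[k] += dC * dt

C_alv = C_art / P_blood_air
print("venous blood (mg/L):", round(C_ven, 4), " alveolar air (mg/L):", round(C_alv, 5))
```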

8.
Hierarchical decision making is a multidimensional process involving management of multiple objectives (with associated metrics and tradeoffs in terms of costs, benefits, and risks), which span various levels of a large-scale system. The nation is a hierarchical system as it consists of multiple classes of decisionmakers and stakeholders ranging from national policymakers to operators of specific critical infrastructure subsystems. Critical infrastructures (e.g., transportation, telecommunications, power, banking, etc.) are highly complex and interconnected. These interconnections take the form of flows of information, shared security, and physical flows of commodities, among others. In recent years, economic and infrastructure sectors have become increasingly dependent on networked information systems for efficient operations and timely delivery of products and services. In order to ensure the stability, sustainability, and operability of our critical economic and infrastructure sectors, it is imperative to understand their inherent physical and economic linkages, in addition to their cyber interdependencies. An interdependency model based on a transformation of the Leontief input-output (I-O) model can be used for modeling: (1) the steady-state economic effects triggered by a consumption shift in a given sector (or set of sectors); and (2) the resulting ripple effects to other sectors. The inoperability metric is calculated for each sector; this is achieved by converting the economic impact (typically in monetary units) into a percentage value relative to the size of the sector. Disruptive events such as terrorist attacks, natural disasters, and large-scale accidents have historically shown cascading effects on both consumption and production. Hence, a dynamic model extension is necessary to demonstrate the interplay between combined demand and supply effects. The result is a foundational framework for modeling cybersecurity scenarios for the oil and gas sector. A hypothetical case study examines a cyber attack that causes a 5-week shortfall in the crude oil supply in the Gulf Coast area.
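The static inoperability calculation at the core of this approach can be written as q = (I - A*)^(-1) c*, where A* is the normalized interdependency matrix, c* the demand perturbation, and q the resulting sector inoperabilities. The sketch below uses a hypothetical 3x3 economy and a 5% perturbation; it is not the paper's calibrated Gulf Coast model.

```python
# Static inoperability input-output sketch: q = (I - A*)^(-1) c*.
import numpy as np

# Normalized interdependency matrix A* (rows/cols: oil & gas, power, transport).
# Values are hypothetical, chosen only so that (I - A*) is invertible.
A_star = np.array([[0.00, 0.20, 0.10],
                   [0.30, 0.00, 0.05],
                   [0.15, 0.10, 0.00]])

# Demand perturbation: a 5% shortfall hitting the oil & gas sector only.
c_star = np.array([0.05, 0.0, 0.0])

q = np.linalg.solve(np.eye(3) - A_star, c_star)
print("equilibrium inoperability by sector:", np.round(q, 4))
```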

9.
There is a need for plant-specific distributions of incidence and failure rates rather than distributions from pooled data, which are based on the "common incidence rate" assumption. The so-called superpopulation model satisfies this need through a practically appealing approach that accounts for the variability over the population of plants. Unfortunately, the chosen order in which the integrals with respect to the individual plant rates λi (i = 0, 1, …, m) and the parameters α, β of the Γ-population distribution are solved seems to drive the solution close to the common incidence rate distribution. It is shown that the solution obtained from interchanging the order and solving the integrals with respect to the individual plant rates by Monte Carlo simulation very quickly provides the plant-specific distribution. This differing solution behaviour may be due to the lack of uniform convergence over the (α, β, λi (i = 1, …, m)) space. Examples illustrate the difference that may be observed.
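A minimal sketch of the two-stage "superpopulation" calculation, not the paper's method: plant rates λi are drawn from a Gamma(α, β) population, and conditional on (α, β) and plant i's own data (ki events in Ti years) the plant-specific posterior is Gamma(α + ki, β + Ti); (α, β) draws are weighted by the marginal likelihood of all plants' data, which keeps the answer plant-specific rather than pooled. The hyperprior ranges and data are hypothetical.

```python
# Crude Monte Carlo over (alpha, beta) with conjugate plant-specific posteriors.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(3)

k = np.array([0, 1, 3, 0, 7])           # events per plant (hypothetical)
T = np.array([10., 15., 20., 5., 40.])  # observation years per plant

def log_marginal(alpha, beta):
    # log prod_i of integral Poisson(k_i | lambda_i T_i) Gamma(lambda_i | alpha, beta) dlambda_i,
    # dropping terms that do not depend on (alpha, beta).
    return np.sum(alpha * np.log(beta) - (alpha + k) * np.log(beta + T)
                  + gammaln(alpha + k) - gammaln(alpha))

n = 20_000
alphas = rng.uniform(0.1, 5.0, n)        # diffuse hyperprior (hypothetical)
betas = rng.uniform(0.1, 100.0, n)
logw = np.array([log_marginal(a, b) for a, b in zip(alphas, betas)])
w = np.exp(logw - logw.max()); w /= w.sum()

# Plant-specific posterior mean rate for plant 0 (few events, short history):
post_mean_0 = np.sum(w * (alphas + k[0]) / (betas + T[0]))
print("plant 0 posterior mean rate:", post_mean_0)
```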

10.
In recent years physiologically based pharmacokinetic models have come to play an increasingly important role in risk assessment for carcinogens. The hope is that they can help open the black box between external exposure and carcinogenic effects to experimental observations, and improve both high-dose to low-dose and interspecies projections of risk. However, to date, there have been only relatively preliminary efforts to assess the uncertainties in current modeling results. In this paper we compare the physiologically based pharmacokinetic models (and model predictions of risk-related overall metabolism) that have been produced by seven different sets of authors for perchloroethylene (tetrachloroethylene). The most striking conclusion from the data is that most of the differences in risk-related model predictions are attributable to the choice of the data sets used for calibrating the metabolic parameters. Second, it is clear that the bottom-line differences among the model predictions are appreciable. Overall, the ratios of low-dose human to bioassay rodent metabolism spanned a 30-fold range for the six available human/rat comparisons, and the seven predicted ratios of low-dose human to bioassay mouse metabolism spanned a 13-fold range. (The greater range for the rat/human comparison is attributable to a structural assumption by one author group of competing linear and saturable pathways, and their conclusion that the dangerous saturable pathway constitutes a minor fraction of metabolism in rats.) It is clear that there are a number of opportunities for modelers to make different choices of model structure, interpretive assumptions, and calibrating data in the process of constructing pharmacokinetic models for use in estimating "delivered" or "biologically effective" dose for carcinogenesis risk assessments. We believe that in presenting the results of such modeling studies, it is important for researchers to explore the results of alternative, reasonably likely approaches for interpreting the available data, and either show that any conclusions they make are relatively insensitive to particular interpretive choices or acknowledge the differences in conclusions that would result from plausible alternative views of the world.
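A toy illustration of the structural point about competing linear and saturable pathways: as concentration rises toward the saturable pathway's capacity, that pathway's share of total metabolism shrinks. The rate constants here are hypothetical, not values from any of the seven models.

```python
# Competing linear and Michaelis-Menten pathways: saturable share vs. dose.
def pathway_split(c, vmax=1.0, km=0.5, k_lin=0.2):
    sat = vmax * c / (km + c)     # saturable (Michaelis-Menten) pathway
    lin = k_lin * c               # linear pathway
    return sat / (sat + lin)      # fraction of metabolism through the saturable route

for c in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f"concentration {c:>6}: saturable share = {pathway_split(c):.2f}")
```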

11.
A detailed mathematical modeling framework for the risk of airborne infectious disease transmission in indoor spaces was developed to enable mathematical analysis of experiments conducted at the Airborne Infections Research (AIR) facility, eMalahleni, South Africa. A model was built using this framework to explore possible causes of why an experiment at the AIR facility did not produce expected results. The experiment was conducted at the AIR facility from August 31, 2015 to December 4, 2015, in which the efficacy of upper room germicidal ultraviolet (GUV) irradiation as an environmental control was tested. However, the experiment did not produce the expected outcome of having fewer infections in the test animal room than the control room. The simulation results indicate that dynamic effects, caused by switching the GUV lights, power outages, or introduction of new patients, did not result in the unexpected outcomes. However, a sensitivity analysis highlights that significant uncertainty exists in risk-of-transmission predictions based on current measurement practices, due to the reliance on large viable literature ranges for parameters.
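Frameworks of this kind typically build on a Wells-Riley style relationship, P = 1 - exp(-I q p t / Q_eq), where upper-room GUV adds an equivalent clean-air removal rate. The sketch below uses that standard form with illustrative values only; it is not the AIR facility model, and the quanta generation rate in particular has a wide literature range.

```python
# Wells-Riley style risk with GUV expressed as extra equivalent air changes.
import math

I = 1          # infectious sources in the room
q = 13.0       # quanta generation rate per source (quanta/h) -- wide literature range
p = 0.36       # breathing rate of an exposed occupant (m^3/h)
V = 90.0       # room volume (m^3)
t = 8.0        # exposure time (h)

for guv_ach in (0.0, 6.0, 12.0):            # GUV expressed as equivalent ACH
    ach_total = 2.0 + guv_ach               # ventilation ACH + GUV removal
    Q_eq = ach_total * V                    # equivalent clean-air flow (m^3/h)
    risk = 1 - math.exp(-I * q * p * t / Q_eq)
    print(f"GUV equivalent ACH {guv_ach:>4}: risk of infection = {risk:.3f}")
```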

12.
Risk Analysis, 2018, 38(8): 1718-1737
We developed a probabilistic mathematical model for the postharvest processing of leafy greens focusing on Escherichia coli O157:H7 contamination of fresh-cut romaine lettuce as the case study. Our model can (i) support the investigation of cross-contamination scenarios, and (ii) evaluate and compare different risk mitigation options. We used an agent-based modeling framework to predict the pathogen prevalence and levels in bags of fresh-cut lettuce and quantify spread of E. coli O157:H7 from contaminated lettuce to surface areas of processing equipment. Using an unbalanced factorial design, we were able to propagate combinations of random values assigned to model inputs through different processing steps and ranked statistically significant inputs with respect to their impacts on selected model outputs. Results indicated that whether contamination originated on incoming lettuce heads or on the surface areas of processing equipment, pathogen prevalence among bags of fresh-cut lettuce and batches was most significantly impacted by the level of free chlorine in the flume tank and frequency of replacing the wash water inside the tank. Pathogen levels in bags of fresh-cut lettuce were most significantly influenced by the initial levels of contamination on incoming lettuce heads or surface areas of processing equipment. The influence of surface contamination on pathogen prevalence or levels in fresh-cut bags depended on the location of that surface relative to the flume tank. This study demonstrates that developing a flexible yet mathematically rigorous modeling tool, a "virtual laboratory," can provide valuable insights into the effectiveness of individual and combined risk mitigation options.
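A toy sketch of the flume-wash step only, not the published agent-based model: cells wash off the heads into the water, free chlorine inactivates a fraction of the suspended cells, and survivors redeposit at a low rate onto every head in the batch, which is how low chlorine levels drive prevalence up. The dose-response constant and transfer fractions are hypothetical.

```python
# Flume-wash cross-contamination sketch with a hypothetical chlorine kill term.
import numpy as np

def wash_batch(loads, free_chlorine_mgL, k_cl=0.6, frac_to_water=0.9,
               water_to_head=1e-4):
    """loads: CFU per lettuce head entering the flume tank."""
    # Cells washed off the heads into the water.
    to_water = frac_to_water * loads
    water = to_water.sum()
    loads = loads - to_water
    # Chlorine kills a fraction of the cells suspended in the water.
    survival = np.exp(-k_cl * free_chlorine_mgL)
    water *= survival
    # Surviving cells redeposit at a low rate onto every head in the batch.
    loads = loads + water_to_head * water
    return loads

batch = np.zeros(500); batch[:5] = 1e5       # 5 contaminated heads out of 500
for cl in (0.5, 2.0, 10.0):
    out = wash_batch(batch.copy(), cl)
    print(f"free chlorine {cl:>4} mg/L: prevalence = {(out > 1).mean():.2%}")
```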
