Similar Articles
20 similar articles were retrieved.
1.
Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single-value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches to cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding further uncertainty to the already wide confidence intervals for cancer risk estimates.
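As an illustration of the kind of calculation this abstract describes, here is a minimal Monte Carlo sketch that propagates uncertainty in both exposure and carcinogenic potency into a confidence interval for the risk estimate. All distributions and parameter values below are hypothetical placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo samples over the uncertainty distributions

# Hypothetical uncertainty in lifetime average daily dose (mg/kg bw/day),
# expressed as a lognormal distribution (median 1e-4, geometric SD ~2).
intake = rng.lognormal(mean=np.log(1e-4), sigma=np.log(2.0), size=n)

# Hypothetical uncertainty in cancer potency (per mg/kg bw/day).
potency = rng.lognormal(mean=np.log(0.5), sigma=np.log(3.0), size=n)

# Lifetime extra cancer risk under a low-dose linear model: risk = dose * potency.
risk = intake * potency

lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
print(f"median risk        : {med:.2e}")
print(f"95% conf. interval : [{lo:.2e}, {hi:.2e}]")

# A single point estimate built from central values can fall well below the
# upper confidence limit of the probabilistic estimate.
point = 1e-4 * 0.5
print(f"single-value estimate: {point:.2e}")
```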

2.
Probabilistic risk assessment (PRA) is a relatively new tool in the nuclear industry. The Reactor Safety Study started the present trend of conducting PRAs for nuclear power plants when it was published in 1975. Now, nine years later, those in the industry currently using PRA techniques are frequently asked the same question: Why should the nuclear utility industry, with so many accepted analytical tools already available, invest the time and manpower to develop a new technique with so many uncertainties?

3.
This article presents a discourse on the incorporation of organizational factors into probabilistic risk assessment (PRA)/probabilistic safety assessment (PSA), a topic of debate since the 1980s that has spurred discussions among industry, regulatory agencies, and the research community. The main contributions of this article include (1) identifying the four key open questions associated with this topic; (2) framing ongoing debates by considering differing perspectives around each question; (3) offering a categorical review of existing studies on this topic to justify the selection of each question and to analyze the challenges related to each perspective; and (4) highlighting the directions of research required to reach a final resolution for each question. The four key questions are: (I) How significant is the contribution of organizational factors to accidents and incidents? (II) How critical, with respect to improving risk assessment, is the explicit incorporation of organizational factors into PRA? (III) What theoretical bases are needed for explicit incorporation of organizational factors into PRA? (IV) What methodological bases are needed for the explicit incorporation of organizational factors into PRA? Questions I and II mainly analyze PRA literature from the nuclear domain. For Questions III and IV, a broader review and categorization are conducted of existing cross-disciplinary studies that have evaluated the effects of organizational factors on safety (not solely PRA-based) to shed more light on future research needs.

4.
The intake of Cd, methyl-Hg, and Pb through consumption of black scabbardfish (BSF) (Aphanopus carbo) in Portugal, as well as the associated probability of exceeding the respective provisional tolerable weekly intakes (PTWIs), was estimated. For this purpose, the contamination levels of heavy metals in this fish species were combined with constructed consumption scenarios or with a hypothesized consumption distribution. Whereas Cd and Pb posed no serious risk, the consumption of at least one portion of BSF per month, as well as the hypothetical study in the Portuguese population, produced nonnegligible probabilities of surpassing the PTWI for Me-Hg. Risk assessment for Portuguese consumers revealed a higher risk for Me-Hg: 1.19% and 1.81% with the plug-in (PI) and the tail estimation (TE) estimators, respectively. In contrast, the risk for Cd and Pb was less than 1 in 100,000. TE was more realistic and accurate for Cd and Pb; for Me-Hg, the TE and PI estimators produced similar results. Furthermore, the limitations of a deterministic approach were shown.
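A minimal sketch of a plug-in (PI) estimate of the probability of exceeding the PTWI, assuming a hypothetical lognormal Me-Hg concentration distribution, a fixed portion size, and the JECFA PTWI of 1.6 µg/kg bw per week. None of the numbers below are taken from the study; the tail-estimation (TE) approach would instead fit an extreme-value model to the upper tail of the intake distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

PTWI = 1.6                   # µg Me-Hg per kg body weight per week (JECFA value)
portion = 160.0              # g of black scabbardfish per meal (illustrative)
meals_per_week = 1.0 / 4.0   # scenario: one portion per month
body_weight = 70.0           # kg (illustrative)

# Hypothetical lognormal distribution of Me-Hg concentration (µg/g wet weight).
conc = rng.lognormal(mean=np.log(0.8), sigma=np.log(1.6), size=n)

weekly_intake = conc * portion * meals_per_week / body_weight  # µg/kg bw/week

# Plug-in estimator: empirical fraction of the simulated population above the PTWI.
p_exceed = np.mean(weekly_intake > PTWI)
print(f"P(weekly intake > PTWI) = {p_exceed:.2%}")
```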

5.
Concern about the degree of uncertainty and potential conservatism in deterministic point estimates of risk has prompted researchers to turn increasingly to probabilistic methods for risk assessment. With Monte Carlo simulation techniques, distributions of risk reflecting uncertainty and/or variability are generated as an alternative. In this paper the compounding of conservatism(1) between the level associated with point-estimate inputs selected from probability distributions and the level associated with the deterministic value of risk calculated using these inputs is explored. Two measures of compounded conservatism are compared and contrasted. The first measure, F, is defined as the ratio of the risk value Rd, calculated deterministically as a function of n inputs each at the jth percentile of its probability distribution, to the risk value Rj that falls at the jth percentile of the simulated risk distribution (i.e., F = Rd/Rj). The percentile of the simulated risk distribution that corresponds to the deterministic value Rd serves as a second measure of compounded conservatism. Analytical results for simple products of lognormal distributions are presented. In addition, a numerical treatment of several complex cases is presented, using five simulation analyses from the literature as illustrations. Overall, there are cases in which conservatism compounds dramatically for deterministic point estimates of risk constructed from upper percentiles of input parameters, as well as those for which the effect is less notable. The analytical and numerical techniques discussed are intended to help analysts explore the factors that influence the magnitude of compounding conservatism in specific cases.
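Both measures can be reproduced for the simple lognormal-product case mentioned in the abstract. The sketch below uses illustrative geometric means and geometric standard deviations (not the paper's values) and compares the deterministic risk Rd, built from 95th-percentile inputs, with the 95th percentile of the simulated risk distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_inputs, j, n_sim = 4, 95, 200_000

# Hypothetical independent lognormal inputs (illustrative GM = 1, GSD = 2.5).
gm, gsd = 1.0, 2.5
mu, sigma = np.log(gm), np.log(gsd)

# Deterministic risk: every input fixed at its own jth percentile.
x_j = stats.lognorm.ppf(j / 100, s=sigma, scale=np.exp(mu))
R_d = x_j ** n_inputs

# Simulated risk distribution: product of the random inputs.
samples = rng.lognormal(mu, sigma, size=(n_sim, n_inputs))
R = samples.prod(axis=1)

R_j = np.percentile(R, j)
F = R_d / R_j                                  # first measure of compounding
pct_of_Rd = stats.percentileofscore(R, R_d)    # second measure

print(f"F = R_d / R_{j} = {F:.2f}")
print(f"R_d falls at the {pct_of_Rd:.1f}th percentile of the simulated risk")
```

With four inputs the deterministic product of 95th-percentile values lands far out in the tail of the simulated risk distribution, which is exactly the compounding effect the paper quantifies.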

6.
A quantitative microbial risk assessment (QMRA) according to the Codex Alimentarius principles is conducted to evaluate the risk of human salmonellosis through household consumption of fresh minced pork meat in Belgium. The quantitative exposure assessment is carried out by building a modular risk model, called the METZOON model, which covers pork production from farm to fork. In the METZOON model, the food production pathway is split into six consecutive modules: (1) primary production, (2) transport and lairage, (3) slaughterhouse, (4) postprocessing, (5) distribution and storage, and (6) preparation and consumption. All the modules are developed to resemble the Belgian situation as closely as possible, making use of the available national data. Several statistical refinements and improved modeling techniques are proposed. The model produces highly realistic results: the baseline predicted number of annual salmonellosis cases is 20,513 (SD 9,061.45). The risk is estimated to be higher for the susceptible population (estimate 4.713 × 10⁻⁵; SD 1.466 × 10⁻⁵) than for the normal population (estimate 7.704 × 10⁻⁶; SD 5.414 × 10⁻⁶) and is mainly due to undercooking and, to a smaller extent, to cross-contamination in the kitchen via the cook's hands.
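A heavily simplified, illustrative sketch of a farm-to-fork Monte Carlo chain of the kind the METZOON model formalizes. The prevalence, concentrations, cooking reductions, dose-response parameters, and consumption figures below are placeholders, not the Belgian data or the METZOON outputs.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000          # simulated servings of fresh minced pork

# Module-style chain with illustrative parameter values.
prevalence = 0.10                                    # fraction of contaminated servings
contaminated = rng.random(n) < prevalence
log_conc = rng.normal(1.0, 1.0, n)                   # log10 CFU/g at retail
serving = 100.0                                      # g

# Preparation: log10 reductions from cooking (undercooking = small reduction).
log_reduction = rng.triangular(1.0, 5.0, 7.0, n)
dose = np.where(contaminated, 10 ** (log_conc - log_reduction) * serving, 0.0)

# Approximate beta-Poisson dose-response for salmonellosis (placeholder alpha, beta).
alpha, beta = 0.1324, 51.45
p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)

servings_per_person_year = 24
population = 11_000_000                              # roughly the Belgian population
risk_per_serving = p_ill.mean()
annual_cases = population * (1 - (1 - risk_per_serving) ** servings_per_person_year)
print(f"mean risk per serving : {risk_per_serving:.2e}")
print(f"predicted annual cases: {annual_cases:,.0f}")
```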

7.
This article describes the evolution of the process for assessing the hazards of a geologic disposal system for radioactive waste and, similarly, nuclear power reactors, and the relationship of this process with other assessments of risk, particularly assessments of hazards from manufactured carcinogenic chemicals during use and disposal. This perspective reviews the common history of scientific concepts for risk assessment developed until the 1950s. Computational tools and techniques developed in the late 1950s and early 1960s to analyze the reliability of nuclear weapon delivery systems were adopted in the early 1970s for probabilistic risk assessment of nuclear power reactors, a technology for which behavior was unknown. In turn, these analyses became an important foundation for performance assessment of nuclear waste disposal in the late 1970s. The evaluation of risk to human health and the environment from chemical hazards is built on methods for assessing the dose response of radionuclides in the 1950s. Despite a shared background, however, societal events, often in the form of legislation, have affected the development path for risk assessment for human health, producing dissimilarities between these risk assessments and those for nuclear facilities. An important difference is the regulator's interest in accounting for uncertainty.

8.
Risk Analysis, 2018, 38(6): 1202–1222
Toxoplasmosis is a cosmopolitan disease with a broad range of hosts, including humans and several wild and domestic animals. Human infection is mostly acquired through the consumption of contaminated food, and pork meat has been recognized as one of the major sources of transmission. There are, however, certain fundamental differences between countries; therefore, the present study specifically aims to evaluate the exposure of the Italian population to Toxoplasma gondii through the ingestion of several types of pork meat products habitually consumed in Italy and to estimate the annual number of human infections within two subgroups of the population. A quantitative risk assessment model was built for this purpose and was enriched with new elements, in comparison to other similar risk assessments, in order to enhance its accuracy. Sensitivity analysis and two alternative scenarios were implemented to identify the factors that have the highest impact on risk and to simulate different plausible conditions, respectively. The estimated overall average number of new infections per year is 12,513 among adults and 92 among pregnant women. The baseline model showed that almost all of these infections are associated with the consumption of fresh meat cuts and preparations (mean risk of infection between 4.5 × 10⁻⁵ and 5.5 × 10⁻⁵), and only a small percentage is due to fermented sausages/salami. By contrast, salt-cured meat products seem to pose a minor risk, although further investigation is needed to clarify aspects that remain unclear. Among all the variables considered, cooking temperature and the concentration of bradyzoites in muscle had the greatest impact on risk.

9.
The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway and used Canadian input parameter values, where available, to represent the risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year, and potential reasons for this overestimation were discussed. A sensitivity analysis showed that the concentration of Salmonella on chicken breasts at retail, and food hygiene practices in private kitchens (such as cross-contamination from not washing cutting boards, utensils, and hands after handling raw meat, along with inadequate cooking), contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that protection from the Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research.
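The growth and inactivation steps mentioned here are commonly modeled with a square-root (Ratkowsky-type) growth model and a log-linear D/z thermal inactivation model. The sketch below shows one such retail-to-table calculation with purely illustrative parameter values, not the values used in this assessment.

```python
import numpy as np

log_n0 = 1.5          # log10 CFU on a chicken breast at retail (illustrative)

# Growth during a temperature-abuse period, square-root (Ratkowsky) model:
# sqrt(mu_max) = b * (T - T_min), with mu_max in log10 CFU/h (illustrative b, T_min).
b, T_min = 0.03, 5.0
T_storage, hours = 15.0, 4.0                    # e.g. left out of the fridge
mu_max = (b * max(T_storage - T_min, 0.0)) ** 2
log_after_storage = log_n0 + mu_max * hours

# Thermal inactivation during cooking, log-linear D/z model (illustrative values):
# D(T) = D_ref * 10 ** ((T_ref - T) / z)
D_ref, T_ref, z = 0.5, 60.0, 10.0               # minutes, °C
T_cook, minutes = 68.0, 2.0                     # internal temperature and hold time
log_reduction = minutes / (D_ref * 10 ** ((T_ref - T_cook) / z))
log_final = log_after_storage - log_reduction

print(f"log10 CFU retail -> after storage -> after cooking: "
      f"{log_n0:.2f} -> {log_after_storage:.2f} -> {log_final:.2f}")
```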

10.
A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size (CES). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose (ICED). Individuals in a population typically show variation, both in their individual exposure (IEXP) and in their ICED. Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between both distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure (IMoE). The proportion of the IMoE distribution below unity is the probability of critical exposure (PoCE) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., both regarding exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the probability of critical exposure (PoCE). The method is illustrated based on data for the case of dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions from the various sources of uncertainty involved may be quantified.
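A minimal two-dimensional sketch of the IMoE/PoCE calculation: the inner loop samples inter-individual variability in IEXP and ICED, the outer loop samples uncertainty about the parameters of those variability distributions, and the result is an uncertainty distribution for PoCE. All distributions and values are hypothetical, not the acephate data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_var, n_unc = 50_000, 500    # inner (variability) and outer (uncertainty) samples

poce = np.empty(n_unc)
for k in range(n_unc):
    # Outer loop: uncertainty about the variability-distribution parameters
    # (illustrative normal uncertainty around hypothetical central values).
    mu_exp  = rng.normal(np.log(0.5), 0.15)   # log individual exposure (µg/kg bw/d)
    sd_exp  = abs(rng.normal(np.log(2.5), 0.05))
    mu_iced = rng.normal(np.log(50.0), 0.20)  # log individual critical effect dose
    sd_iced = abs(rng.normal(np.log(2.0), 0.05))

    # Inner loop: variability between individuals.
    iexp = rng.lognormal(mu_exp,  sd_exp,  n_var)
    iced = rng.lognormal(mu_iced, sd_iced, n_var)
    imoe = iced / iexp                     # individual margin of exposure

    poce[k] = np.mean(imoe < 1.0)          # probability of critical exposure

lo, med, hi = np.percentile(poce, [5, 50, 95])
print(f"PoCE median {med:.2e}, 90% uncertainty interval [{lo:.2e}, {hi:.2e}]")
```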

11.
Whether and to what extent contaminated sites harm ecological and human health are topics of considerable interest, but also of considerable uncertainty. Several federal and state agencies have approved the use of some or many aspects of probabilistic risk assessment (PRA), but its site-specific application has often been limited to high-profile sites and large projects. Nonetheless, times are changing: newly developed software tools, and recent federal and state guidance documents formalizing PRA procedures, now make PRA a readily available method of analysis for even small-scale projects. This article presents and discusses a broad review of PRA literature published since 2000.

12.
The differences between probabilistic risk assessment (PRA) and safety analysis (SA) are discussed, and it is shown that PRA is more suitable than SA for determining the acceptability of a technology. Since a PRA by the fault tree-event tree analysis method used for reactor safety studies does not seem to be practical for buried waste, an alternative approach is suggested using geochemical analogs. This method is illustrated for the cases of high-level and low-level radioactive waste and for chemical carcinogens released in coal burning.

13.
Physiologically based pharmacokinetic (PBPK) models are often submitted to or selected by agencies, such as the U.S. Environmental Protection Agency (U.S. EPA) and the Agency for Toxic Substances and Disease Registry, for consideration for application in human health risk assessment (HHRA). Recently, U.S. EPA evaluated the human PBPK models for perchlorate and radioiodide for their ability to estimate the relative sensitivity of perchlorate inhibition of thyroidal radioiodide uptake for various population groups and lifestages. The most well-defined mode of action of the environmental contaminant perchlorate is competitive inhibition of thyroidal iodide uptake by the sodium-iodide symporter (NIS). In this analysis, a six-step framework for PBPK model evaluation was followed, and with a few modifications, the models were determined to be suitable for use in HHRA to evaluate relative sensitivity among human lifestages. Relative sensitivity to perchlorate was determined by comparing the PBPK model-predicted percent inhibition of thyroidal radioactive iodide uptake (RAIU) by perchlorate for different lifestages. A limited sensitivity analysis indicated that model parameters describing urinary excretion of perchlorate and iodide were particularly important in the prediction of RAIU inhibition; therefore, a range of biologically plausible values available in the peer-reviewed literature was evaluated. Using the updated PBPK models, the greatest sensitivity to RAIU inhibition was predicted for the near-term fetus (gestation week 40) compared with the average adult and other lifestages; however, when exposure factors were taken into account, newborns were also identified as a population needing further evaluation and consideration in a risk assessment for perchlorate.
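The mode of action described here, competitive inhibition of NIS-mediated iodide uptake, can be sketched with a classical competitive-inhibition expression; the full PBPK models instead track perchlorate and iodide kinetics across compartments and lifestages. The function and all kinetic constants below are illustrative assumptions, not values from the evaluated models.

```python
import numpy as np

def raiu_inhibition(c_perchlorate, c_iodide, km_iodide, ki_perchlorate):
    """Percent inhibition of NIS-mediated iodide uptake, treating perchlorate
    as a classical competitive inhibitor (illustrative, not the full PBPK model)."""
    uptake_free = c_iodide / (km_iodide + c_iodide)
    km_apparent = km_iodide * (1.0 + c_perchlorate / ki_perchlorate)
    uptake_inhibited = c_iodide / (km_apparent + c_iodide)
    return 100.0 * (1.0 - uptake_inhibited / uptake_free)

# Hypothetical serum concentrations and affinity constants (µg/L), chosen only
# to show the shape of the relationship.
for c_clo4 in (1.0, 10.0, 100.0):
    pct = raiu_inhibition(c_clo4, c_iodide=5.0, km_iodide=4_000.0, ki_perchlorate=200.0)
    print(f"perchlorate {c_clo4:6.1f} µg/L -> predicted RAIU inhibition {pct:5.1f}%")
```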

14.
Today, there is a worldwide infrastructure of offshore structure systems that includes fixed, floating, and mobile platforms, pipelines, and ships. Background on current and future trends in the development of comprehensive programs to help improve the quality and reliability of offshore structure systems is discussed. A combination of proactive, reactive, and interactive risk assessment and management approaches has been developed and applied. Two risk assessment and management instruments are detailed in this article: a qualitative Quality Management Assessment System (QMAS) and a quantitative System Risk Analysis System (SYRAS). Application of QMAS to produce human and organizational performance shaping factors that are used as input to SYRAS is discussed.

15.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to a large radiological release. Current Level 3 PRA does not explicitly include social factors, and it is therefore not possible to rank the importance of social factors for risk-informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard-independent social vulnerability index for the local population; (2) developing a location-specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS-based socio-technical risk map by combining the social vulnerability index and the location-specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio-technical risk. The methodology is applied using results from the 2012 Surry Power Station state-of-the-art reactor consequence analysis. A radiological hazard model is generated from the MELCOR Accident Consequence Code System, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location-specific SVI themes based on their influence on risk, providing input for EPPR.
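A toy version of steps 2 through 4, with numpy arrays standing in for GIS rasters: a hypothetical radiological hazard surface is combined with hypothetical SVI theme layers, and each theme is ranked by a simple risk-reduction-worth style importance measure. The layer names, combination rule, and importance measure are assumptions for illustration only, not the article's method.

```python
import numpy as np

rng = np.random.default_rng(4)
shape = (50, 50)                     # grid cells standing in for a GIS raster

# Hypothetical location-specific radiological hazard (e.g. projected dose),
# decaying with distance from the plant at the grid centre.
y, x = np.indices(shape)
dist = np.hypot(y - 25, x - 25) + 1.0
hazard = 100.0 / dist ** 2

# Hypothetical SVI theme layers in [0, 1].
themes = {name: rng.random(shape) for name in
          ("socioeconomic", "household", "minority_language", "housing_transport")}
svi = np.mean(list(themes.values()), axis=0)

risk = hazard * svi                  # socio-technical risk surface
base = risk.sum()

# Risk-reduction-worth style ranking: how much total risk falls if one theme
# were reduced to zero everywhere (illustrative importance measure).
for name in themes:
    reduced_svi = np.mean([v if n != name else np.zeros(shape)
                           for n, v in themes.items()], axis=0)
    importance = 1.0 - (hazard * reduced_svi).sum() / base
    print(f"{name:18s} contributes {importance:.1%} of total socio-technical risk")
```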

16.
This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
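The ANOVA-based screening can be sketched on a toy nonlinear, threshold-type model: each input is binned into quintiles, a one-way ANOVA F statistic is computed for the risk output across the bins, and the ranking is compared with a Spearman correlation. The toy model and all values below are invented for illustration and are unrelated to the MFSPR model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 20_000

# Synthetic inputs for a toy risk model with a threshold and an interaction,
# mimicking the nonlinearity that can defeat correlation-based measures.
storage_temp = rng.normal(7.0, 3.0, n)              # °C
storage_time = rng.lognormal(np.log(24), 0.5, n)    # hours
cook_temp = rng.normal(68.0, 4.0, n)                # °C
inputs = {"storage_temp": storage_temp,
          "storage_time": storage_time,
          "cook_temp": cook_temp}

growth = np.where(storage_temp > 5.0, 0.02 * (storage_temp - 5.0) * storage_time, 0.0)
survival = np.where(cook_temp < 65.0, 2.0, 0.1)
risk = growth * survival

for name, values in inputs.items():
    # ANOVA: bin the input into quintiles and test for differences in mean risk.
    bins = np.quantile(values, [0.2, 0.4, 0.6, 0.8])
    groups = [risk[np.digitize(values, bins) == g] for g in range(5)]
    f_val, _ = stats.f_oneway(*groups)
    rho, _ = stats.spearmanr(values, risk)
    print(f"{name:13s}  F = {f_val:9.1f}   Spearman rho = {rho:+.2f}")
```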

17.
Various methods for risk characterization have been developed using probabilistic approaches, and data on Vietnamese farmers are available for comparing the outcomes of risk characterization using different probabilistic methods. This article addresses the health risk characterization of chlorpyrifos using epidemiological dose-response data and probabilistic techniques obtained from a case study with rice farmers in Vietnam. Urine samples were collected from farmers and analyzed for trichloropyridinol (TCP), which was converted into an absorbed daily dose of chlorpyrifos. Adverse health response doses due to chlorpyrifos exposure were collected from epidemiological studies to develop dose-adverse health response relationships. The health risk of chlorpyrifos was quantified using the hazard quotient (HQ), Monte Carlo simulation (MCS), and overall risk probability (ORP) methods. With baseline (prior to pesticide spraying) and lifetime exposure levels (over a lifetime of pesticide spraying events), the HQ ranged from 0.06 to 7.1; the MCS method indicated that less than 0.05% of the population would be affected, while the ORP method indicated that less than 1.5% of the population would be adversely affected. With postapplication exposure levels, the HQ ranged from 1 to 32.5; the risk calculated by the MCS method was that 29% of the population would be affected, and the risk calculated by the ORP method was 33%. The MCS and ORP methods have advantages in risk characterization because they use the full exposure distribution as well as the dose-response relationship, whereas the HQ method used only the exposure data distribution. These evaluations indicated that single-event spraying is likely to have adverse effects on Vietnamese rice farmers.
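A minimal sketch contrasting the three characterizations on hypothetical data: HQ compares a point-estimate dose with a reference dose, MCS reports the fraction of the simulated population above that reference dose, and ORP integrates a dose-response curve over the full exposure distribution. The lognormal exposure, reference dose, and dose-response parameters are placeholders, not the Vietnamese field data.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000

# Hypothetical absorbed daily doses of chlorpyrifos (µg/kg bw/day), lognormal.
dose = rng.lognormal(mean=np.log(3.0), sigma=np.log(2.5), size=n)

# 1) Hazard quotient: point estimate against a reference dose (illustrative RfD).
rfd = 10.0
hq = np.percentile(dose, 95) / rfd
print(f"HQ (95th-percentile dose / RfD) = {hq:.2f}")

# 2) Monte Carlo simulation: fraction of the population above the reference dose.
print(f"MCS: P(dose > RfD) = {np.mean(dose > rfd):.2%}")

# 3) Overall risk probability: integrate a dose-response curve over the full
#    exposure distribution (illustrative log-logistic dose-response).
ed50, slope = 50.0, 2.0
p_response = 1.0 / (1.0 + (ed50 / dose) ** slope)
print(f"ORP: expected fraction with adverse response = {p_response.mean():.2%}")
```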

18.
We conducted a regional-scale integrated ecological and human health risk assessment by applying the relative risk model with Bayesian networks (BN-RRM) to a case study of the South River, Virginia mercury-contaminated site. Risk to four ecological services of the South River (human health, water quality, recreation, and the recreational fishery) was evaluated using a multiple stressor–multiple endpoint approach. These four ecological services were selected as endpoints based on stakeholder feedback and prioritized management goals for the river. The BN-RRM approach allowed for the calculation of relative risk to 14 biotic, human health, recreation, and water quality endpoints from chemical and ecological stressors in five risk regions of the South River. Results indicated that water quality and the recreational fishery were the ecological services at highest risk in the South River. Human health risk for users of the South River was low relative to the risk to other endpoints. Risk to recreation in the South River was moderate with little spatial variability among the five risk regions. Sensitivity and uncertainty analysis identified stressors and other parameters that influence risk for each endpoint in each risk region. This research demonstrates a probabilistic approach to integrated ecological and human health risk assessment that considers the effects of chemical and ecological stressors across the landscape.

19.
This article models flood occurrence probabilistically and assesses the associated risk. It incorporates atmospheric parameters to forecast rainfall in an area. This measure of precipitation, together with river and ground parameters, serves as input to the model to predict runoff and, subsequently, the inundation depth of an area. The inundation depth acts as a guide for predicting flood proneness and the associated hazard. The vulnerability owing to flood has been analyzed as social vulnerability (VS), vulnerability to property (VP), and vulnerability to the location in terms of awareness (VA). The associated risk has been estimated for each area. The distribution of risk values can be used to classify every area into one of six risk zones: very low risk, low risk, moderately low risk, medium risk, high risk, and very high risk. Prioritization regarding preparedness, evacuation planning, or distribution of relief items should be guided by the range on the risk scale within which the area under study falls. The flood risk assessment model framework has been tested on a real-life case study, and the flood risk indices for each of the municipalities in the study area have been calculated. The risk indices, and hence the flood risk zone within which a municipality is expected to lie, would change daily. The appropriate authorities can then plan ahead in terms of preparedness to combat the impending flood situation in the most critical and vulnerable areas.
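One simple way to combine the hazard with the three vulnerability components into a risk index and assign each area to one of the six zones is sketched below. The equal weighting, the scores, and the zone thresholds are illustrative assumptions, not the article's model.

```python
import numpy as np

rng = np.random.default_rng(7)
n_areas = 12

# Hypothetical per-area scores in [0, 1]: hazard from predicted inundation depth,
# plus the three vulnerability components used in the article.
hazard = rng.random(n_areas)   # derived from inundation depth
v_s = rng.random(n_areas)      # social vulnerability
v_p = rng.random(n_areas)      # vulnerability to property
v_a = rng.random(n_areas)      # vulnerability in terms of awareness

# One simple combination rule (equal weights; the article's exact form may differ).
risk = hazard * (v_s + v_p + v_a) / 3.0

zones = ["very low", "low", "moderately low", "medium", "high", "very high"]
edges = np.linspace(risk.min(), risk.max(), len(zones) + 1)[1:-1]
for i, r in enumerate(risk):
    zone = zones[np.digitize(r, edges)]
    print(f"area {i:2d}: risk index {r:.3f} -> {zone} risk zone")
```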

20.
Context in the Risk Assessment of Digital Systems
As the use of digital computers for instrumentation and control of safety-critical systems has increased, there has been a growing debate over whether probabilistic risk assessment techniques can be applied to these systems. This debate has centered on the issue of whether software failures can be modeled probabilistically. This paper describes a context-based approach to software risk assessment that explicitly recognizes that the behavior of software is not probabilistic. The perceived uncertainty in its behavior results both from the input to the software and from the application and environment in which the software is operating. Failures occur as the result of encountering some context for which the software was not properly designed, as opposed to the software simply failing randomly. The paper elaborates on the concept of error-forcing context as it applies to software. It also illustrates a methodology that uses event trees, fault trees, and the Dynamic Flowgraph Methodology (DFM) to identify error-forcing contexts for software in the form of fault tree prime implicants.
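The idea of identifying error-forcing contexts as fault-tree prime implicants can be illustrated with a brute-force search for minimal cut sets of a toy software fault tree. The event names and tree structure below are invented for illustration; the actual methodology derives them with event trees, fault trees, and DFM.

```python
from itertools import combinations

EVENTS = ["sensor_bias", "stale_input", "mode_confusion", "watchdog_fail"]

def top_event(active):
    """Toy fault tree: TOP = (sensor_bias AND stale_input) OR
                             (mode_confusion AND watchdog_fail)."""
    return ({"sensor_bias", "stale_input"} <= active
            or {"mode_confusion", "watchdog_fail"} <= active)

# Brute-force minimal cut sets: smallest event combinations that force the top
# event and contain no smaller combination that also forces it.
cut_sets = []
for r in range(1, len(EVENTS) + 1):
    for combo in combinations(EVENTS, r):
        candidate = set(combo)
        if top_event(candidate) and not any(c < candidate for c in cut_sets):
            cut_sets.append(candidate)

for c in cut_sets:
    print("error-forcing combination:", sorted(c))
```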
