Similar Literature
 13 similar articles found
1.
Terje Aven, Risk Analysis, 2010, 30(3): 354–360
It is a common perspective in risk analysis that there are two kinds of uncertainty: (i) variability resulting from heterogeneity and stochasticity (aleatory uncertainty) and (ii) partial ignorance or epistemic uncertainty resulting from systematic measurement error and lack of knowledge. Probability theory is recognized as the proper tool for treating aleatory uncertainty, but views differ on the best approach for describing partial ignorance and epistemic uncertainty. Subjective probabilities are often used to represent this type of ignorance and uncertainty, but several alternative approaches have been suggested, including interval analysis, probability bound analysis, and bounds based on evidence theory. It is argued that probability theory generates results that are too precise when the background knowledge supporting the probabilities is poor. In this article, we look more closely into this issue. We argue that this critique of probability theory rests on a conception of risk assessment as a tool for objectively reporting the true risk and variabilities. If risk assessment is instead seen as a method for describing the analysts' (and possibly other stakeholders') uncertainties about unknown quantities, the alternative approaches (such as interval analysis) often fail to provide the necessary decision support.

2.
We propose 14 principles of good practice to assist people in performing and reviewing probabilistic or Monte Carlo risk assessments, especially in the context of the federal and state statutes concerning chemicals in the environment. Monte Carlo risk assessments for hazardous waste sites that follow these principles will be easier to understand, will explicitly distinguish assumptions from data, and will consider and quantify effects that could otherwise lead to misinterpretation of the results. The proposed principles are neither mutually exclusive nor collectively exhaustive. We think and hope that these principles will evolve as new ideas arise and come into practice.

3.
The individual plant analyses in the U.S. Nuclear Regulatory Commission's reassessment of the risk from commercial nuclear power plants (NUREG-1150) consist of four parts: systems analysis, accident-progression analysis, source-term analysis, and consequence analysis. Careful definition of the interfaces between these parts is necessary for both information flow and computational efficiency. This paper describes the procedure used to define the interface between the source-term analysis and the consequence analysis. This interface is accomplished by forming groups of source terms with similar properties and then performing one set of MACCS calculations for each group.
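The grouping step can be illustrated with a simple clustering sketch. The release-fraction features and the use of k-means are assumptions for illustration only; NUREG-1150 used its own partitioning procedure, and MACCS is the separate consequence code that each group representative would be passed to.

```python
# Illustrative grouping of source terms so that one consequence
# calculation (e.g., a MACCS run) can represent each group.
# The k-means algorithm and the release-fraction features are
# assumptions, not the NUREG-1150 partitioning scheme.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(seed=0)

# Hypothetical source terms: release fractions of three nuclide
# groups (iodine, cesium, noble gases) plus release timing (hours).
source_terms = rng.uniform(
    low=[0.0, 0.0, 0.0, 1.0],
    high=[0.5, 0.3, 1.0, 24.0],
    size=(200, 4),
)

# Standardize features so timing does not dominate the distance metric.
scaled = (source_terms - source_terms.mean(axis=0)) / source_terms.std(axis=0)

# Group the 200 source terms into 10 representative groups.
kmeans = KMeans(n_clusters=10, n_init=10, random_state=0).fit(scaled)

# One representative (here, the group mean) per group would then be
# passed to the consequence code.
for g in range(10):
    members = source_terms[kmeans.labels_ == g]
    print(f"group {g}: {len(members)} source terms, "
          f"mean iodine release fraction {members[:, 0].mean():.3f}")
```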

4.
In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment-caused injuries while in the hospital. While risk cannot be entirely eliminated from health-care activities, an important goal is to develop effective and durable mitigation strategies to render the system "safer." In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health-care domain this can be extremely challenging, due both to the wide variability in the way health-care processes and interventions are executed and to the dynamic nature of risk in this domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational-level and policy-level contributions to risk evolve over time, and how policies and decisions may affect the general system-level contribution to AE risk. It also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation-based, differential-equation modeling tool that is widely used in situations where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient-level factors, as well as physician-level decisions and factors in the management of an individual patient, which contribute to the risk of hospital-acquired AEs. BBNs are networks of probabilities that capture probabilistic relations between variables and encode historical information about their relationships, and they are powerful tools for modeling causes and effects in many domains. The model is intended to support hospital decisions with regard to staffing, length of stay, and investments in safety, which evolve dynamically over time. The methodology has been applied to two common types of AE, pressure ulcers and vascular-catheter-associated infection, and the models have been validated with eight years of clinical data and expert opinion.
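A minimal sketch of the BBN idea at the patient level follows. The two-parent structure (patient mobility and nursing surveillance as parents of a pressure-ulcer node) and all probabilities are invented for illustration; the study's networks are far richer.

```python
# Minimal Bayesian-belief-network sketch of patient-level AE risk.
# Structure and numbers are illustrative assumptions only.
# P(ulcer) = sum over parent states of P(ulcer | m, s) * P(m) * P(s)

# Priors for the parent nodes.
p_immobile = 0.30          # P(patient has low mobility)
p_low_surveillance = 0.20  # P(nursing surveillance is low)

# Conditional probability table: P(pressure ulcer | mobility, surveillance).
cpt = {
    (True, True): 0.15,    # immobile, low surveillance
    (True, False): 0.06,   # immobile, adequate surveillance
    (False, True): 0.02,   # mobile, low surveillance
    (False, False): 0.005, # mobile, adequate surveillance
}

# Marginal probability of a pressure ulcer, by enumeration.
p_ulcer = sum(
    cpt[(m, s)]
    * (p_immobile if m else 1 - p_immobile)
    * (p_low_surveillance if s else 1 - p_low_surveillance)
    for m in (True, False)
    for s in (True, False)
)
print(f"P(pressure ulcer) = {p_ulcer:.4f}")

# Diagnostic query by Bayes' rule: P(immobile | ulcer observed).
p_joint_immobile = sum(
    cpt[(True, s)] * p_immobile
    * (p_low_surveillance if s else 1 - p_low_surveillance)
    for s in (True, False)
)
print(f"P(immobile | ulcer) = {p_joint_immobile / p_ulcer:.4f}")
```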

5.
A risk criterion is a benchmark that distinguishes what is considered an acceptable level of safety from what is not. One way of determining quantitative risk criteria for temporary changes in a nuclear power plant, based on probabilistic safety assessment, is presented. The risk criteria are based on the timing and the duration of the change. Several examples of temporary changes in a nuclear power plant were examined to evaluate the criteria. Results show that it is possible to determine a set of risk criteria for temporary changes. Such risk criteria can provide a standpoint for risk-informed decision making.
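One common way to make a duration-based criterion concrete is the incremental conditional core damage probability (ICCDP). The sketch below assumes this formulation and an illustrative screening threshold drawn from general risk-informed guidance (e.g., US NRC RG 1.177); the paper's own criteria may differ.

```python
# Sketch of a duration-based risk criterion for a temporary plant
# configuration: ICCDP = (CDF_temporary - CDF_baseline) * duration.
# All numerical values are illustrative; the 5e-7 threshold follows
# common risk-informed guidance, not necessarily this paper.

HOURS_PER_YEAR = 8760.0

def iccdp(cdf_baseline, cdf_temporary, duration_hours):
    """Incremental conditional core damage probability for a
    temporary configuration held for `duration_hours`."""
    return (cdf_temporary - cdf_baseline) * duration_hours / HOURS_PER_YEAR

# Hypothetical example: taking a safety train out of service for
# maintenance raises CDF from 2e-5 to 8e-5 per year for 72 hours.
risk = iccdp(cdf_baseline=2e-5, cdf_temporary=8e-5, duration_hours=72)
print(f"ICCDP = {risk:.2e}")
print("acceptable" if risk < 5e-7 else "needs compensatory measures")
```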

6.
In this article, we present a methodology to assess the risk incurred by a participant in an activity involving danger of injury. The lack of high-quality historical data for the case considered prevented us from constructing a sufficiently detailed statistical model. It was therefore decided to generate a risk assessment model based on expert judgment. The methodology is illustrated in a real case context: the assessment of risk to participants in a San Fermin bull-run in Pamplona (Spain). The members of the panel of "experts on the bull-run" represented very different perspectives on the phenomenon: runners, surgeons and other health care personnel, journalists, civil defense workers, security staff, organizers, herdsmen, authors of books on the bull-run, etc. We consulted 55 experts. Our methodology includes the design of a survey instrument to elicit the experts' views and the statistical and mathematical procedures used to aggregate their subjective opinions.
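As a minimal sketch of opinion aggregation, the snippet below pools hypothetical elicited injury probabilities from 55 experts with an equally weighted linear opinion pool; the study's actual survey instrument and aggregation procedure are more elaborate.

```python
# Linear opinion pool over hypothetical expert elicitations.
# The elicited values and the equal weighting are assumptions.
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical elicited probabilities of injury per runner per run,
# one value per expert (55 experts were consulted in the study).
elicited = rng.lognormal(mean=np.log(3e-3), sigma=0.5, size=55)

weights = np.full(55, 1 / 55)  # equal weights; performance-based
                               # weights are a common alternative
pooled = np.sum(weights * elicited)

print(f"pooled injury probability per run: {pooled:.2e}")
print(f"range across experts: {elicited.min():.2e} to {elicited.max():.2e}")
```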

7.
An omnibus spending bill in 2014 directed the Department of Energy (DOE) to analyze how effectively it identifies, programs, and executes its plans to address the public health and safety risks that are part of its remaining environmental cleanup liabilities. A committee identified two dozen issues and associated recommendations for the DOE, other federal agencies, and the U.S. Congress to consider, as well as for other stakeholders such as states and tribal nations. In regard to risk assessment, the committee described a risk review process that uses available data and expert experience, identifies major data gaps, permits input from key stakeholders, and creates an ordered set of risks based on what is known. Probabilistic risk assessments could follow up on these risk reviews. In regard to risk management, the states in particular have become major drivers of how resources are allocated. States apply different laws, set different priorities, and challenge DOE's policies in different ways. Land use decisions vary, technology choices differ, and other notable variations are apparent. The cost differences associated with these variations are marked. The net result is that resources do not necessarily go to the most prominent human health and safety risks as seen from the national level.

8.
Various methods for risk characterization have been developed using probabilistic approaches. Data on Vietnamese farmers are available for comparing the outcomes of risk characterization across different probabilistic methods. This article addresses the health risk characterization of chlorpyrifos using epidemiological dose-response data and probabilistic techniques, based on a case study with rice farmers in Vietnam. Urine samples were collected from the farmers and analyzed for trichloropyridinol (TCP), which was converted into the absorbed daily dose of chlorpyrifos. Adverse health response doses due to chlorpyrifos exposure were collected from epidemiological studies to develop dose-adverse health response relationships. The health risk of chlorpyrifos was quantified using the hazard quotient (HQ), Monte Carlo simulation (MCS), and overall risk probability (ORP) methods. With baseline (prior to pesticide spraying) and lifetime exposure levels (over a lifetime of pesticide spraying events), the HQ ranged from 0.06 to 7.1; the MCS method indicated that less than 0.05% of the population would be affected, while the ORP method indicated that less than 1.5% would be adversely affected. With postapplication exposure levels, the HQ ranged from 1 to 32.5; the risk calculated by the MCS method was that 29% of the population would be affected, and the risk calculated by the ORP method was 33%. The MCS and ORP methods have advantages in risk characterization because they use the full distributions of both exposure and dose-response data, whereas the HQ method uses only the exposure data distribution. These evaluations indicated that single-event spraying is likely to have adverse effects on Vietnamese rice farmers.
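The HQ and MCS calculations can be sketched as follows. The lognormal dose distribution, the reference dose, and the effect-threshold distribution are invented for illustration and are not the study's Vietnamese field data.

```python
# Hazard quotient (HQ) and Monte Carlo simulation (MCS) sketch.
# All distribution parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=2)

# Hypothetical absorbed daily dose of chlorpyrifos (mg/kg-day),
# lognormally distributed across the farmer population.
dose = rng.lognormal(mean=np.log(0.002), sigma=0.8, size=100_000)

RFD = 0.003  # illustrative reference dose, mg/kg-day

# Point-estimate HQ (dose / reference dose) at selected percentiles.
for q in (50, 95):
    print(f"HQ at {q}th percentile dose: {np.percentile(dose, q) / RFD:.2f}")

# MCS-style estimate: fraction of the simulated population whose dose
# exceeds a (here also uncertain) adverse-effect threshold.
threshold = rng.lognormal(mean=np.log(0.006), sigma=0.5, size=dose.size)
frac_affected = np.mean(dose > threshold)
print(f"fraction of population affected (MCS): {frac_affected:.2%}")
```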

9.
Currently, there is a growing preference for convenience food products such as ready-to-eat (RTE) foods, which have long refrigerated shelf-lives and require no heat treatment prior to consumption. Because Listeria monocytogenes is able to grow at refrigeration temperatures, inconsistent temperatures during production, distribution, and in the consumer's household may allow the pathogen to thrive, reaching unsafe levels. L. monocytogenes is the causative agent of listeriosis, a rare but severe human illness with high fatality rates, transmitted almost exclusively by food consumption. With the aim of assessing the quantitative microbial risk of L. monocytogenes in RTE chicken salads, a challenge test was performed. Salads were inoculated with a three-strain mixture of cold-adapted L. monocytogenes and stored at 4, 12, and 16 °C for eight days. Results revealed that the salad was able to support L. monocytogenes growth, even at refrigeration temperatures. The Baranyi primary model was fitted to the microbiological data to estimate the pathogen's growth kinetic parameters, and the effect of temperature on the maximum specific growth rate (μmax) was modeled using a square-root-type model. Storage temperature significantly influenced the μmax of L. monocytogenes (p < 0.05). These predicted growth models were subsequently used to develop a quantitative microbial risk assessment, which estimated a median of 0.00008726 listeriosis cases per year linked to the consumption of these RTE salads. Sensitivity analysis considering different time-temperature scenarios indicated a very low median risk per portion (< −7 log), even if the assessed RTE chicken salad was kept under abusive storage conditions.
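The secondary model named above relates growth rate to temperature as √μmax = b(T − Tmin). A sketch with illustrative parameter values (not the fitted values from the challenge test) follows.

```python
# Square-root (Ratkowsky-type) secondary growth model:
#   sqrt(mu_max) = b * (T - T_min)
# Parameter values below are illustrative assumptions, not the
# values fitted to the chicken-salad data.

b = 0.023      # 1/(h^0.5 * degC), illustrative slope
T_MIN = -1.5   # degC, illustrative theoretical minimum growth temperature

def mu_max(temp_c):
    """Maximum specific growth rate (1/h) at temperature temp_c (degC)."""
    return (b * (temp_c - T_MIN)) ** 2

for temp in (4, 12, 16):  # the storage temperatures used in the study
    print(f"T = {temp:2d} degC: mu_max = {mu_max(temp):.4f} 1/h")
```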

10.
Risk Analysis, 2018, 38(1): 17–30
The extent of economic losses due to a natural hazard or disaster depends largely on the spatial distribution of asset values in relation to the hazard intensity distribution within the affected area. Given that statistical data on asset value are collected by administrative units in China, generating spatially explicit asset exposure maps remains a key challenge for rapid postdisaster economic loss assessment. The goal of this study is to introduce a top-down (downscaling) approach to disaggregate administrative-unit-level asset value to the grid-cell level. The key is to find highly correlated "surrogate" indicators: a combination of three data sets (a nighttime light grid, the LandScan population grid, and a road density grid) is used as ancillary information on asset density distribution for spatializing the asset value. As a result, a high-spatial-resolution asset value map of China for 2015 is generated; the spatial data set contains aggregated economic value at risk at 30 arc-second resolution. The accuracy of the spatial disaggregation reflects redistribution errors introduced by the disaggregation process as well as errors from the original ancillary data sets, and the overall accuracy of the results proves promising. An example using the disaggregated asset value map in an exposure assessment of watersheds demonstrates that the data set offers great analytical flexibility for overlay analysis according to the hazard extent. This product will support current efforts to analyze the spatial characteristics of exposure and to uncover the contributions of both physical and social drivers of natural hazards and disasters across space and time.
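The downscaling step can be sketched as a weighted (dasymetric) allocation that preserves the administrative-unit control total. The equal weighting of the three surrogate layers below is an assumption; the paper derives its own combination.

```python
# Weighted disaggregation of an administrative unit's asset total
# over its grid cells. Surrogate layers and equal weights are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(seed=3)

unit_asset_value = 5.0e9  # total asset value of one admin unit (CNY)

# Hypothetical per-cell surrogate layers within the unit, each
# normalized to [0, 1]: nighttime light, population, road density.
n_cells = 1000
light = rng.random(n_cells)
population = rng.random(n_cells)
roads = rng.random(n_cells)

# Combined weight per cell (equal weights assumed for illustration).
weight = (light + population + roads) / 3.0

# Allocate the unit total in proportion to the weights, so the cell
# values sum back exactly to the administrative control total.
cell_value = unit_asset_value * weight / weight.sum()

assert np.isclose(cell_value.sum(), unit_asset_value)
print(f"max cell value: {cell_value.max():.3e} CNY")
```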

11.
Risk Analysis, 2018, 38(7): 1455–1473
Recently, growing earthquake activity in the northeastern Netherlands has aroused considerable concern among the 600,000 provincial inhabitants. There, at a depth of 3 km, the rich Groningen gas field extends over 900 km² and still contains about 600 of the original 2,800 billion cubic meters (bcm) of gas. Particularly after 2001, earthquakes have increased in number, magnitude (M, on the logarithmic Richter scale), and damage to numerous buildings. The man-made nature of extraction-induced earthquakes challenges static notions of risk, complicates formal risk assessment, and calls into question familiar conceptions of acceptable risk. Here, a 26-year set of 294 earthquakes with M ≥ 1.5 is statistically analyzed in relation to increasing cumulative gas extraction since 1963. Extrapolations from the fast-rising trend over 2001–2013 indicate that, under "business as usual," around 2021 some 35 earthquakes with M ≥ 1.5 might occur annually, including four with M ≥ 2.5 (ten-fold stronger) and one with M ≥ 3.5 every 2.5 years. Given this uneasy prospect, annual gas extraction has been reduced from 54 bcm in 2013 to 24 bcm in 2017. This has significantly reduced earthquake activity, so far. However, if extraction is stabilized at 24 bcm per year for 2017–2021 (or 21.6 bcm, as judicially established in Nov. 2017), the annual number of earthquakes would gradually increase again, with an expected all-time maximum M ≈ 4.5. Further safety management may best follow the distinct stages of seismic risk generation, with moderation of gas extraction and massive (but late and slow) building reinforcement as the outstanding strategies. Officially, "acceptable risk" is mainly approached by quantifying risk (e.g., of fatal building collapse) for testing against national safety standards, but actual (local) risk estimation remains problematic. Also important are societal cost-benefit analysis, equity considerations, and precautionary restraint. Socially and psychologically, deliberate attempts are being made to improve risk communication, reduce public anxiety, and restore people's confidence in responsible experts and policymakers.
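The trend extrapolation can be sketched as an exponential fit to annual event counts, combined with the roughly ten-fold thinning per magnitude unit noted in the abstract. The counts below are synthetic stand-ins, not the Groningen catalog.

```python
# Exponential trend fit and extrapolation of annual earthquake counts.
# The synthetic counts are illustrative, not the actual catalog.
import numpy as np

years = np.arange(2001, 2014)
# Hypothetical annual counts of M >= 1.5 events, rising over 2001-2013.
counts = np.array([9, 9, 10, 11, 11, 12, 13, 14, 15, 16, 17, 19, 20])

# Linear fit in log space: log(N) = a * year + c.
a, c = np.polyfit(years, np.log(counts), deg=1)

n_2021 = np.exp(a * 2021 + c)
print(f"projected M>=1.5 events in 2021: {n_2021:.0f}")       # cf. ~35/yr in the paper
print(f"projected M>=2.5 events in 2021: {n_2021 / 10:.1f}")  # ten-fold fewer
print(f"projected M>=3.5 events in 2021: {n_2021 / 100:.2f}") # per magnitude unit
```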

12.
This article presents a new methodology for implementing the concept of equity in regional earthquake risk mitigation programs using an optimization framework. The framework could be used by decision makers (governments and authorities) to structure a budget allocation strategy across different seismic risk mitigation measures, i.e., structural retrofitting for different building structural types in different locations and planning horizons. A two-stage stochastic model is developed to seek optimal mitigation measures by minimizing mitigation expenditures, reconstruction expenditures, and especially large losses in highly seismically active countries. To consider fairness in the distribution of financial resources among different groups of people, the equity concept is incorporated through constraints in the model formulation. These constraints limit inequity to a user-defined level so as to achieve the equity-efficiency tradeoff in the decision-making process. To demonstrate practical application, the proposed model is applied to a pilot area in Tehran, the capital city of Iran. Building stocks, structural vulnerability functions, and regional seismic hazard characteristics are incorporated to compile a probabilistic seismic risk model for the pilot area. Results illustrate the variation of mitigation expenditures by location and structural type. These expenditures are sensitive to the available budget and to equity considerations at a constant level of risk aversion. Most significantly, equity is more easily achieved when the budget is unlimited; conversely, increasing equity when the budget is limited decreases efficiency. The risk-return tradeoff, the equity-reconstruction-expenditures tradeoff, and the variation of per-capita expected earthquake loss across income classes are also presented.
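A toy version of the budget-allocation-with-equity idea can be written as a linear program: maximize expected loss reduction subject to a total budget and a per-district spending floor. All numbers and the linear loss model are invented; the paper's formulation is a two-stage stochastic program, not this simple LP.

```python
# Budget allocation with an equity constraint as a linear program.
# All coefficients are illustrative assumptions.
from scipy.optimize import linprog

# Marginal expected-loss reduction per unit spent in each district
# (loss units per budget unit); district 2 is the most "efficient".
reduction = [0.8, 1.5, 1.1]

BUDGET = 100.0
EQUITY_FLOOR = 0.2  # each district must receive at least 20% of the budget

# linprog minimizes c @ x, so maximize total reduction via c = -reduction.
res = linprog(
    c=[-r for r in reduction],
    A_ub=[[1.0, 1.0, 1.0]],  # total spending <= BUDGET
    b_ub=[BUDGET],
    bounds=[(EQUITY_FLOOR * BUDGET, None)] * 3,
    method="highs",
)

print("allocation per district:", res.x)       # floor met, remainder to district 2
print("expected loss reduction:", -res.fun)
```

Raising the equity floor shifts spending away from the most efficient district, which is exactly the equity-efficiency tradeoff the abstract describes.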

13.
The application of the exponential model is extended by the inclusion of new nonhuman primate (NHP), rabbit, and guinea pig dose-lethality data for inhalation anthrax. Because deposition is a critical step in the initiation of inhalation anthrax, inhaled doses may not provide the most accurate cross-species comparison. For this reason, species-specific deposition factors were derived to translate inhaled dose to deposited dose. Four NHP, three rabbit, and two guinea pig data sets were utilized. Results from species-specific pooling analysis suggested that all four NHP data sets could be pooled into a single NHP data set; the same was true for the rabbit and guinea pig data sets. The three species-specific pooled data sets could not be combined into a single generic mammalian data set. For inhaled dose, NHPs were the most sensitive species (lowest relative LD50) and rabbits the least. Improved inhaled LD50s proposed for use in risk assessment are 50,600, 102,600, and 70,800 spores for NHP, rabbit, and guinea pig, respectively. Lung deposition factors were estimated for each species using published deposition data from Bacillus spore exposures, particle deposition studies, and computer modeling. Deposition was estimated at 22%, 9%, and 30% of the inhaled dose for NHP, rabbit, and guinea pig, respectively. When the inhaled dose is adjusted to reflect deposited dose, the rabbit appears to be the most sensitive animal model and the guinea pig the least sensitive.
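The exponential dose-response model and the deposited-dose adjustment can be reproduced directly from the figures quoted in the abstract; only the example exposure below is arbitrary.

```python
# Exponential dose-response model, P(death) = 1 - exp(-k * dose),
# with k tied to the LD50 by k = ln(2) / LD50, plus the
# inhaled-to-deposited dose adjustment. LD50s and deposition
# fractions are taken from the abstract; the example dose is arbitrary.
import math

SPECIES = {
    "NHP":        {"ld50_inhaled": 50_600,  "deposition": 0.22},
    "rabbit":     {"ld50_inhaled": 102_600, "deposition": 0.09},
    "guinea pig": {"ld50_inhaled": 70_800,  "deposition": 0.30},
}

def p_death(dose, ld50):
    """Exponential dose-response: P = 1 - exp(-(ln 2 / LD50) * dose)."""
    return 1.0 - math.exp(-math.log(2) / ld50 * dose)

inhaled_dose = 20_000  # arbitrary example exposure, spores
for name, s in SPECIES.items():
    ld50_deposited = s["ld50_inhaled"] * s["deposition"]
    print(f"{name:11s} P(death | inhaled dose) = "
          f"{p_death(inhaled_dose, s['ld50_inhaled']):.3f}, "
          f"deposited LD50 = {ld50_deposited:,.0f} spores")
```

On the deposited-dose scale the rabbit's LD50 (about 9,200 spores) is the lowest of the three, which reproduces the abstract's conclusion that the rabbit is the most sensitive model once deposition is accounted for.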
