Similar Documents (20 results)
1.
A risk criterion is a term that distinguishes what is considered an acceptable level of safety from what is not. One way of determining quantitative risk criteria for temporary changes in a nuclear power plant, based on probabilistic safety assessment, is presented. The criteria depend on the timing and duration of the change. Several examples of temporary changes in a nuclear power plant were examined to evaluate the criteria. The results show that it is possible to determine a set of risk criteria for temporary changes and that such criteria can serve as a reference point for risk-informed decision making.

2.
Infrequently, it seems, a significant accident precursor or, worse, an actual accident involving a commercial nuclear power reactor occurs to remind us of the need to reexamine the safety of this important electrical power technology from a risk perspective. Twenty-five years after the major core damage accident at Chernobyl in Ukraine, the Fukushima reactor complex in Japan experienced multiple core damage events as a result of an earthquake-induced tsunami beyond both the earthquake and tsunami design bases for the site. Although the tsunami itself killed tens of thousands of people and left the area devastated and virtually uninhabitable, much concern still arose from the potential radioactive releases from the damaged reactors, even though there was little population left in the area to be affected. As a lifelong probabilistic safety analyst in nuclear engineering, even I must admit to a recurrence of the doubt regarding nuclear power safety after Fukushima that I had experienced after Three Mile Island and Chernobyl. This article is my attempt to "recover" my personal perspective on acceptable risk by examining both the domestic and worldwide history of commercial nuclear power plant accidents and attempting to quantify the risk in terms of the frequency of core damage that one might glean from a review of operational history.
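As a rough illustration of the kind of operational-history estimate described above, the sketch below divides a count of core damage events by accumulated reactor-years and attaches a standard chi-square (Poisson) confidence interval. The event and reactor-year counts are hypothetical placeholders, not figures from the article.

```python
# Hedged sketch: core damage frequency (CDF) from operating experience,
# with a two-sided Poisson confidence interval via the chi-square method.
# Both counts below are hypothetical placeholders.
from scipy.stats import chi2

events = 11                 # hypothetical number of core damage events
reactor_years = 14_500      # hypothetical accumulated commercial reactor-years

cdf_point = events / reactor_years
lower = chi2.ppf(0.025, 2 * events) / (2 * reactor_years)
upper = chi2.ppf(0.975, 2 * (events + 1)) / (2 * reactor_years)

print(f"CDF ≈ {cdf_point:.1e} per reactor-year "
      f"(95% interval {lower:.1e} to {upper:.1e})")
```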

3.
Risk Analysis, 2018, 38(6): 1107-1115
Coal combustion residuals (CCRs) are composed of various constituents, including radioactive materials. The objective of this study was to apply the Environmental Protection Agency's (EPA) radionuclide risk assessment methodology to estimate the potential cancer risks associated with residential exposure to CCR-containing soil. We evaluated potential radionuclide exposure via soil ingestion, inhalation of soil particulates, and external exposure to ionizing radiation, using published CCR radioactivity values for Th-232, Ra-228, U-238, and Ra-226 from the Appalachia, Illinois, and Powder River coal basins. Mean and upper-bound cancer risks were estimated individually for each radionuclide, exposure pathway, and coal basin. For each radionuclide at each coal basin, external exposure to ionizing radiation contributed the most to the overall risk estimate, followed by incidental ingestion of soil and inhalation of soil particulates. The mean cancer risks by route of exposure were 2.01 × 10−6 (ingestion), 6.80 × 10−9 (inhalation), and 3.66 × 10−5 (external), while the upper-bound cancer risks were 3.70 × 10−6 (ingestion), 1.18 × 10−8 (inhalation), and 6.15 × 10−5 (external), using summed radionuclide-specific data from all locations. The upper-bound cancer risk from all routes of exposure was 6.52 × 10−5. These estimated cancer risks were within the EPA's acceptable cancer risk range of 1 × 10−6 to 1 × 10−4. If the CCR radioactivity values used in this analysis are generally representative of CCR waste streams, then our findings suggest that CCRs would not be expected to pose a significant radiological risk to residents living in areas where contact with CCR-containing soils might occur.
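For readers who want to reproduce the summation step, the sketch below adds the upper-bound route-specific risks quoted in the abstract and checks the total against EPA's acceptable risk range; it is an arithmetic illustration only, not the study's code.

```python
# Sum the quoted upper-bound cancer risks by route of exposure and compare
# the total with EPA's acceptable risk range of 1e-6 to 1e-4.
upper_bound_risk = {
    "ingestion": 3.70e-6,
    "inhalation": 1.18e-8,
    "external": 6.15e-5,
}

total_risk = sum(upper_bound_risk.values())      # ≈ 6.52e-5
within_epa_range = 1e-6 <= total_risk <= 1e-4

print(f"Total upper-bound risk: {total_risk:.2e}; within EPA range: {within_epa_range}")
```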

4.
Microbial food safety risk assessment models can often be simplified by eliminating the need to integrate a complex dose-response relationship across a distribution of exposure doses. This is possible if exposure pathways consistently lead to doses that have a small probability of causing illness. In this situation, the probability of illness is an approximately linear function of dose, and consequently the predicted probability of illness per serving across all exposures is linear with respect to the expected value of dose. The majority of dose-response functions are approximately linear when the dose is low; however, what constitutes "low" depends on the parameters of the dose-response function for a particular pathogen. In this study, a method is proposed to determine an upper bound of the exposure distribution for which the use of a linear dose-response function is acceptable. If this upper bound is substantially larger than the expected value of exposure doses, then a linear approximation for the probability of illness is reasonable. If conditions are appropriate for using the linear dose-response approximation, for example, if the expected value of exposure doses is two to three orders of magnitude (log10) smaller than the upper bound of the linear portion of the dose-response function, then predicting the risk-reducing effectiveness of a proposed policy is trivial. Simple examples illustrate how this approximation can be used to inform policy decisions and improve an analyst's understanding of risk.
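A minimal sketch of the low-dose linearity argument, assuming an exponential dose-response model P(ill) = 1 − exp(−r·dose) with a hypothetical parameter r (the abstract does not name a specific model): it searches for the largest dose at which the linear approximation r·dose stays within a chosen relative error.

```python
import math

r = 1e-3            # hypothetical dose-response parameter
tolerance = 0.05    # accept at most 5% relative error from linearity

def p_ill(dose):
    """Exponential dose-response model (illustrative choice)."""
    return 1.0 - math.exp(-r * dose)

dose, upper_bound = 1.0, 1.0
while abs(r * dose - p_ill(dose)) / p_ill(dose) <= tolerance:
    upper_bound = dose
    dose *= 1.1

print(f"Linear approximation acceptable up to dose ≈ {upper_bound:.0f}")
# If the expected exposure dose is two to three log10 below this bound,
# using risk ≈ r * E[dose] per serving is a reasonable simplification.
```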

5.
This study assesses the risk of deaths and injuries from motor vehicle accidents associated with the evacuation of population groups in case of nuclear plant accidents. The risk per person-km is evaluated using: (a) data from previous evacuations, namely the Soufriere evacuation (Guadeloupe Island, 1976) and Mississauga (1979), added to Hans and Sell's data, in which no road accident occurred for a sample of 1,500,000 persons; and (b) the national recording system for motor vehicle accidents, from which average rates of 2.2 × 10−8 deaths per person-km and 32 × 10−8 injuries per person-km are calculated. These latter rates, for France, overestimate the number of casualties. A reasonable hypothesis is to assume that the probability of road accident occurrence follows a Poisson distribution, as these events are independent and infrequent; since no accident was observed in a sample of 1,500,000 persons, the probability lies between 0 and an upper value of 0.24 × 10−8 deaths per person-km and 3.29 × 10−8 injuries per person-km. The average and maximum populations within different radii around French and U.S. nuclear power sites are taken as sample sizes in order to study the total risk of deaths and injuries under the hypothesis that an evacuation is necessary to protect the populations.
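The zero-event Poisson bound mentioned above can be illustrated as follows: with no accidents observed over a given exposure, the one-sided upper confidence limit on the rate is −ln(α)/exposure. The person-km exposure and evacuation size used here are hypothetical, since the abstract reports only the number of persons.

```python
import math

alpha = 0.05
person_km_observed = 1.5e6 * 100   # hypothetical: 1,500,000 evacuees x 100 km each

# Zero observed accidents -> one-sided upper bound on the accident rate
upper_rate = -math.log(alpha) / person_km_observed
print(f"Upper {100 * (1 - alpha):.0f}% bound: {upper_rate:.1e} deaths per person-km")

# Bounded expected deaths for a hypothetical evacuation of N people over D km
N, D = 200_000, 50
print(f"Bounded expected deaths: {upper_rate * N * D:.3f}")
```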

6.
Risk-informed decision making is often accompanied by the specification of an acceptable level of risk. Such a target level is compared against the value of a risk metric, usually computed through a probabilistic safety assessment model, to decide about the acceptability of a given design, the launch of a space mission, etc. Importance measures complement the decision process with information about the risk/safety significance of events. However, importance measures do not tell us whether the occurrence of an event can change the overarching decision. By linking value of information and importance measures for probabilistic risk assessment models, this work obtains a value-of-information-based importance measure that brings together the risk metric, risk importance measures, and the risk threshold in one expression. The new importance measure does not impose additional computational burden because it can be calculated from our knowledge of the risk achievement and risk reduction worth, and it complements the insights delivered by these importance measures. Several properties are discussed, including the joint decision worth of basic event groups. An application to the large loss of coolant accident sequence of the Advanced Test Reactor illustrates the risk analysis insights.
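The sketch below is only a schematic of how a risk metric, the standard risk achievement worth (RAW) and risk reduction worth (RRW), and a risk threshold can be combined for a single basic event; it does not reproduce the value-of-information-based measure derived in the paper, and all numbers are hypothetical.

```python
R0 = 2.0e-5          # baseline risk metric, e.g. core damage frequency (hypothetical)
threshold = 1.0e-4   # acceptable risk level (hypothetical)

RAW = 8.0            # risk achievement worth of the basic event (hypothetical)
RRW = 1.2            # risk reduction worth of the basic event (hypothetical)

risk_if_occurs = R0 * RAW        # risk metric with the event assumed to occur
risk_if_prevented = R0 / RRW     # risk metric with the event assumed prevented

# Does the occurrence of this single event flip the acceptability decision?
decision_changes = (R0 <= threshold) and (risk_if_occurs > threshold)
print(f"Risk if event occurs: {risk_if_occurs:.1e}; decision changes: {decision_changes}")
```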

7.
There are four operating nuclear power plant (NPP) units in Finland. The Teollisuuden Voima (TVO) power company has two 840 MWe BWR units supplied by Asea-Atom at the Olkiluoto site. The Fortum corporation (formerly IVO) has two 500 MWe VVER 440/213 units at the Loviisa site. In addition, a 1600 MWe European Pressurized Water Reactor supplied by AREVA NP (formerly the Framatome ANP-Siemens AG Consortium) is under construction at the Olkiluoto site. Recently, the Finnish Parliament ratified the government Decision in Principle that the utilities' applications to build two new NPP units are in line with the total good of society. The Finnish utilities, the Fenno power company and the TVO company, are in the process of qualifying the reactor type for the new nuclear builds. In Finland, risk-informed applications are formally integrated into the regulatory process for NPPs, beginning already in the early design phase and running through the construction and operation phases for the entire plant service time. A plant-specific full-scope probabilistic risk assessment (PRA) is required for each NPP. PRAs shall cover internal events, area events (fires, floods), and external events such as harsh weather conditions and seismic events, in all operating modes. Special attention is devoted to the use of various risk-informed PRA applications in the licensing of the Olkiluoto 3 NPP.

8.
For diseases with more than one risk factor, the sum of probabilistic estimates of the number of cases caused by each individual factor may exceed the total number of cases observed, especially when uncertainties about exposure and dose response for some risk factors are high. In this study, we outline a method of bounding the fraction of lung cancer fatalities not due to specific well-studied causes. Such information serves as a "reality check" for estimates of the impacts of the minor risk factors and, as such, complements traditional risk analysis. With lung cancer as our example, we allocate portions of the observed lung cancer mortality to known causes (such as smoking, residential radon, and asbestos fibers) and describe the uncertainty surrounding those estimates. The interactions among the risk factors are also quantified, to the extent possible. We then infer an upper bound on the residual mortality due to "other" causes, using a consistency constraint on the total number of deaths, the maximum uncertainty principle, and the mathematics of imprecise probabilities.
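The bounding step can be illustrated with simple arithmetic, as in the sketch below: observed total deaths minus lower-bound attributions to well-studied causes gives an upper bound on the residual due to "other" causes. All numbers are hypothetical, and the paper's actual treatment uses imprecise probabilities rather than single attributions.

```python
observed_total = 150_000            # annual lung cancer deaths (hypothetical)

# Lower-bound attributions to well-studied causes (hypothetical values)
known_lower_bounds = {
    "smoking": 120_000,
    "residential_radon": 5_000,
    "asbestos": 2_000,
}

# Consistency constraint: attributions cannot exceed the observed total,
# so the residual due to "other" causes is bounded from above.
residual_upper_bound = observed_total - sum(known_lower_bounds.values())
print(f"Upper bound on deaths from 'other' causes: {residual_upper_bound}")
```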

9.
Self-driving vehicles (SDVs) promise to considerably reduce traffic crashes. One pressing concern facing the public, automakers, and governments is "How safe is safe enough for SDVs?" To answer this question, a new expressed-preference approach was proposed for the first time to determine the socially acceptable risk of SDVs. In our between-subject survey (N = 499), we determined the respondents' risk-acceptance rates for scenarios with varying traffic-risk frequencies to examine the logarithmic relationship between traffic-risk frequency and risk-acceptance rate. Logarithmic regression models for SDVs were compared with those for human-driven vehicles (HDVs); the results showed that SDVs were required to be safer than HDVs. Given the same traffic-risk-acceptance rates for SDVs and HDVs, the associated acceptable risk frequencies were predicted and compared. Two risk-acceptance criteria emerged: the tolerable risk criterion, which indicates that SDVs should be four to five times as safe as HDVs, and the broadly acceptable risk criterion, which suggests that half of the respondents hoped that the traffic risk of SDVs would be two orders of magnitude lower than the current estimated traffic risk. The approach and these results could provide insights for government regulatory authorities for establishing clear safety requirements for SDVs.
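A minimal sketch of the logarithmic acceptance model described above, assuming acceptance = a + b·log10(traffic-risk frequency) fitted separately for SDVs and HDVs; the coefficients are hypothetical, not the survey's fitted values.

```python
def acceptable_frequency(target_acceptance, a, b):
    """Invert acceptance = a + b*log10(f) to get the risk frequency
    accepted at a given acceptance rate."""
    return 10 ** ((target_acceptance - a) / b)

# Hypothetical fitted coefficients (acceptance rate expressed in [0, 1])
a_sdv, b_sdv = -0.90, -0.35
a_hdv, b_hdv = -0.30, -0.25

for rate in (0.5, 0.9):
    f_sdv = acceptable_frequency(rate, a_sdv, b_sdv)
    f_hdv = acceptable_frequency(rate, a_hdv, b_hdv)
    print(f"acceptance {rate:.0%}: SDV must be {f_hdv / f_sdv:.1f}x safer than HDV")
```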

10.
Adrian Kent, Risk Analysis, 2004, 24(1): 157-168
Recent articles by Busza et al. (BJSW) and Dar et al. (DDH) argue that astrophysical data can be used to establish small bounds on the risk of a "killer strangelet" catastrophe scenario in the RHIC and ALICE collider experiments. The case for the safety of the experiments set out by BJSW does not rely solely on these bounds, but on theoretical arguments, which BJSW find sufficiently compelling to firmly exclude any possibility of catastrophe. Nonetheless, DDH and other commentators (initially including BJSW) suggested that these empirical bounds alone give sufficient reassurance. This seems unsupportable when the bounds are expressed in terms of expectation value, a good measure according to standard risk analysis arguments. For example, DDH's main bound, p(catastrophe) < 2 × 10−8, implies only that the expectation value of the number of deaths is bounded by 120; BJSW's most conservative bound implies the expectation value of the number of deaths is bounded by 60,000. This article reappraises the DDH and BJSW risk bounds by comparison with risk policy in other areas. For example, it is noted that, even if highly risk-tolerant assumptions are made and no value is placed on the lives of future generations, a catastrophe risk no higher than approximately 10−15 per year would be required for consistency with established policy for radiation hazard risk minimization. Allowing for risk aversion and for future lives, a respectable case can be made for requiring a bound many orders of magnitude smaller. In summary, the costs of small risks of catastrophe have been significantly underestimated by BJSW (initially), by DDH, and by other commentators. Future policy on catastrophe risks would be more rational, and more deserving of public trust, if acceptable risk bounds were generally agreed upon ahead of time and if serious research on whether those bounds could indeed be guaranteed were carried out well in advance of any hypothetically risky experiment, with the relevant debates involving experts with no stake in the experiments under consideration.
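The expectation-value arithmetic behind the quoted figures can be checked in one line each, assuming a world population of roughly 6 billion at the time; the 10−5 value attributed to BJSW's most conservative bound below is inferred from the quoted 60,000 expected deaths, not stated in the abstract.

```python
world_population = 6e9      # assumed population of the era

ddh_bound = 2e-8            # DDH: p(catastrophe) < 2e-8
bjsw_bound = 1e-5           # inferred from the quoted 60,000 expected deaths

print(f"DDH bound  -> expected deaths <= {ddh_bound * world_population:.0f}")   # ~120
print(f"BJSW bound -> expected deaths <= {bjsw_bound * world_population:.0f}")  # ~60,000
```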

11.
Environmental tobacco smoke (ETS) has recently been determined by U.S. environmental and occupational health authorities to be a human carcinogen. We develop a model that permits using atmospheric nicotine measurements to estimate nonsmokers' ETS lung cancer risks in individual workplaces for the first time. We estimate that during the 1980s, the U.S. nonsmoking adult population's median nicotine lung exposure (homes and workplaces combined) was 143 micrograms (μg) of nicotine daily, and that the most-exposed adult nonsmokers inhaled 1430 μg/day. These exposure estimates are validated by pharmacokinetic modeling, which yields the corresponding steady-state dose of the nicotine metabolite, cotinine. For U.S. adult nonsmokers of working age, we estimate median cotinine values of about 1.0 nanogram per milliliter (ng/ml) in plasma and 6.2 ng/ml in urine; for the most-exposed nonsmokers, we estimate cotinine concentrations of about 10 ng/ml in plasma and 62 ng/ml in urine. These values are consistent to within 15% with the cotinine values observed in contemporaneous clinical epidemiological studies. The corresponding median risk from ETS exposure in U.S. nonsmokers during the 1980s is estimated at about two lung cancer deaths (LCDs) per 1000 at risk, and for the most-exposed nonsmokers, about two LCDs per 100. Risks abroad appear similar. Modeling of the lung cancer mortality risk from passive smoking suggests that de minimis [i.e., "acceptable" (10−6)] risk occurs at an 8-hr time-weighted-average exposure concentration of 7.5 nanograms of ETS nicotine per cubic meter of workplace air for a working lifetime of 40 years. This model is based upon a linear exposure-response relationship validated by physical, clinical, and epidemiological data. From available data, it appears that workplaces without effective smoking policies considerably exceed this de minimis risk standard. For a substantial fraction of the 59 million nonsmoking workers in the U.S., current workplace exposure to ETS also appears to pose risks exceeding the de manifestis risk level above which carcinogens are strictly regulated by the federal government.
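A sketch of the linear exposure-response relationship described above: if de minimis risk (10−6) corresponds to an 8-hr TWA of 7.5 ng/m3 of ETS nicotine over a 40-year working lifetime, the working-lifetime risk at other workplace concentrations scales proportionally. The slope is back-calculated from those two quoted numbers, and the example concentrations are hypothetical.

```python
de_minimis_risk = 1e-6
de_minimis_conc = 7.5     # ng/m3 of ETS nicotine, 8-hr TWA, 40-year working lifetime

slope_per_ng_m3 = de_minimis_risk / de_minimis_conc

# Hypothetical workplace 8-hr TWA concentrations, ng/m3
for concentration in (7.5, 100.0, 1000.0):
    risk = slope_per_ng_m3 * concentration
    print(f"{concentration:7.1f} ng/m3 -> working-lifetime lung cancer risk ≈ {risk:.1e}")
```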

12.
This paper presents the results of a study that identified how often a probabilistic risk assessment (PRA) should be updated to accommodate the changes that take place at nuclear power plants. Based on a 7-year analysis of design and procedural changes at one plant, we consider 5 years to be the maximum interval for updating PRAs. This conclusion is preliminary because it is based on the review of changes that occurred at a single plant, and it addresses only PRAs that involve a Level 1 analysis (i.e., a PRA including calculation of core damage frequency only). Nevertheless, this conclusion indicates that maintaining a useful PRA requires periodic updating efforts. However, the need for this periodic update stems only partly from the number of changes that can be expected to take place at nuclear power plants: changes that individually have only a moderate to minor impact on the PRA, but whose combined impact is substantial and necessitates a PRA update. Additionally, a comparison of two generations of PRAs performed about 5 years apart indicates that PRAs must be periodically updated to reflect the evolution of PRA methods. The most desirable updating interval depends on these two technical considerations as well as the cost of updating the PRA. (Cost considerations, however, were beyond the scope of this study.)

13.
Public opinion poll data have consistently shown that the proportion of respondents who are willing to have a nuclear power plant in their own community is smaller than the proportion who agree that more nuclear plants should be built in this country. Respondents' judgments of the minimum safe distance from each of eight hazardous facilities confirmed that this finding results from perceived risk gradients that differ by facility (e.g., nuclear vs. natural gas power plants) and social group (e.g., chemical engineers vs. environmentalists) but are relatively stable over time. Ratings of the facilities on thirteen perceived risk dimensions were used to determine whether any of the dimensions could explain the distance data. Because the rank order of the facilities with respect to acceptable distance was very similar to the rank order on a number of the perceived risk dimensions, it is difficult to determine which of the latter is the critical determinant of acceptable distance if, indeed, there is only one. There were, however, a number of reversals of rank order that indicate that the respondents had a differentiated view of technological risk. Finally, data from this and other studies were interpreted as suggesting that perceived lack of any other form of personal control over risk exposure may be an important factor in stimulating public opposition to the siting of hazardous facilities.

14.
A matrix formulation for calculating the public risk arising from the operation of a nuclear power plant, and for identifying the dominant contributors to that risk, is described and numerically illustrated. The matrix methodology is used as a superstructure in a probabilistic risk-assessment study to organize the calculated probabilities and to facilitate the analysis and documentation effort. The matrix structure is built to manipulate the large amount of data arising from event- and fault-tree analysis and other supporting analyses. It lends itself easily to computerization and provides an analytic capability to identify dominant contributors to risk. It is a useful tool for aiding sensitivity analyses and also a potential formalism for standardization of risk-assessment studies. This tool has already been used in two recent comprehensive nuclear power plant risk-assessment efforts, the Zion and Indian Point Safety Studies.
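A minimal sketch of what a matrix superstructure of this kind can look like: an initiating-event frequency vector is propagated through conditional-probability matrices for plant damage states and release categories, and dominant contributors fall out of the intermediate products. All matrices below are hypothetical illustrations, not data from the Zion or Indian Point studies.

```python
import numpy as np

# Frequencies of 3 initiating events (per reactor-year), hypothetical
initiator_freq = np.array([1e-2, 3e-4, 5e-5])

# P(plant damage state j | initiating event i): 3 initiators x 2 damage states
damage_matrix = np.array([
    [1e-4, 1e-5],
    [1e-2, 2e-3],
    [5e-2, 1e-2],
])

# P(release category k | plant damage state j): 2 damage states x 2 categories
release_matrix = np.array([
    [0.9, 0.1],
    [0.3, 0.7],
])

release_freq = initiator_freq @ damage_matrix @ release_matrix
print("Release-category frequencies per reactor-year:", release_freq)

# Dominant contributors: each initiator's contribution to release category 0
contrib = initiator_freq * (damage_matrix @ release_matrix)[:, 0]
print("Initiator contributions to category 0:", contrib)
```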

15.
Risk Analysis, 2018, 38(6): 1169-1182
Flooding in urban areas during heavy rainfall, often characterized by short-duration, high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex because it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to increase the intensity and frequency of heavy rainfall events. This study develops a methodology to link high-spatial-resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million/year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively.
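The expected-annual-damage figures quoted above come from integrating damage over the probability of flood events; the sketch below shows that calculation for a hypothetical annual-exceedance-probability curve (the values are not the Greater London results).

```python
import numpy as np

# Hypothetical annual exceedance probabilities and damages (£ million)
aep = np.array([0.5, 0.1, 0.02, 0.005])
damage = np.array([10.0, 120.0, 600.0, 1500.0])

# Expected annual damage: trapezoidal integration of damage over
# exceedance probability (sorted so probability is increasing)
p, d = aep[::-1], damage[::-1]
ead = float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

print(f"Expected annual damage ≈ £{ead:.0f} million/year")
```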

16.
Rex V. Brown, Risk Analysis, 2005, 25(1): 125-140
After two decades of massive investigation, the federal approval process for a nuclear waste site is drawing to a close. Large-scale research to assure that major hazards such as this one are socially acceptable is often highly inefficient, as it was here. A regulatory remedy would be to require not only that the currently assessed risk be acceptable, but also that the risk would remain acceptable given any new information. Research to test compliance with these rules has to be cost effective. Research activities should be managed according to an explicit discipline that can be imposed on powerful conflicting interests. They might be required to (1) set targets for the first- and second-order risk assessments, (2) allocate research resources to close any gap between current and target assessments cost effectively, and (3) reallocate resources as evolving findings dictate. The interests of the license applicant (Department of Energy, DOE) and society are distinguished: the former would want the application approved; the latter would want an unacceptable facility rejected.

17.
Accidents with automatic production systems are reported to occur on the order of one per hundred to one per thousand robot-years, while fatal accidents occur one to two orders of magnitude less frequently. Traditions in occupational safety tend to set safety targets of zero severe accidents for automatic systems. Decision making requires a risk assessment that balances potential risk reduction measures and costs within the cultural environment of a production company. This paper presents a simplified procedure that acts as a decision tool. The procedure is based on a risk concept that approaches prevention both in a deterministic and in a probabilistic manner. Eight accident scenarios are shown to represent the potential accident processes involving robot interactions with people. Seven prevention policies are shown to cover the accident scenarios in principle. An additional probabilistic approach may indicate which extra safety measures can be taken, and at what risk reduction and additional cost. The risk evaluation process aims at achieving a quantitatively acceptable risk level. For that purpose, three risk evaluation methods are discussed with respect to reaching broad consensus on the safety targets.

18.
Over time, concerns have been raised regarding the potential for human exposure and risk from asbestos in cosmetic-talc-containing consumer products. In 1985, the U.S. Food and Drug Administration (FDA) conducted a risk assessment evaluating the potential inhalation asbestos exposure associated with the cosmetic talc consumer use scenario of powdering an infant during diapering, and found that risks were below levels associated with background asbestos exposures and risk. However, given the scope and age of the FDA's assessment, it was unknown whether the agency's conclusions remained relevant to current risk assessment practices, talc application scenarios, and exposure data. This analysis updates the previous FDA assessment by incorporating the current published exposure literature associated with consumer use of talcum powder and using the current U.S. Environmental Protection Agency's (EPA) nonoccupational asbestos risk assessment approach to estimate potential cumulative asbestos exposure and risk for four use scenarios: (1) infant exposure during diapering; (2) adult exposure from infant diapering; (3) adult exposure from face powdering; and (4) adult exposure from body powdering. The estimated range of cumulative asbestos exposure potential for all scenarios (assuming an asbestos content of 0.1%) ranged from 0.0000021 to 0.0096 f/cc-yr and resulted in risk estimates that were within or below EPA's acceptable target risk levels. Consistent with the original FDA findings, exposure and corresponding health risk in this range were orders of magnitude below upper-bound estimates of cumulative asbestos exposure and risk at ambient levels, which have not been associated with increased incidence of asbestos-related disease.

19.
There has been considerable discussion regarding the conservativeness of low-dose cancer risk estimates based upon linear extrapolation from upper confidence limits. Various groups have expressed a need for best (point) estimates of cancer risk in order to improve risk/benefit decisions. Point estimates of carcinogenic potency obtained from maximum likelihood estimates of the low-dose slope may be highly unstable, being sensitive both to the choice of the dose-response model and possibly to minimal perturbations of the data. For carcinogens that augment background carcinogenic processes and/or for mutagenic carcinogens, the tumor incidence versus target tissue dose is expected to be linear at low doses. Pharmacokinetic data may be needed to identify and adjust for exposure-dose nonlinearities. Based on the assumption that the dose response is linear over low doses, a stable point estimate for low-dose cancer risk is proposed. Since various models give similar estimates of risk down to levels of 1%, a stable estimate of the low-dose cancer slope is provided by ŝ = 0.01/ED01, where ED01 is the dose corresponding to an excess cancer risk of 1%. Thus, low-dose estimates of cancer risk are obtained by risk = ŝ × dose. The proposed procedure is similar to one that has been utilized in the past by the Center for Food Safety and Applied Nutrition, Food and Drug Administration. The upper confidence limit, s*, corresponding to this point estimate of the low-dose slope is similar to the upper limit, q1*, obtained from the generalized multistage model. The advantage of the proposed procedure is that ŝ provides stable estimates of low-dose carcinogenic potency that are not unduly influenced by small perturbations of the tumor incidence rates, unlike the maximum likelihood estimate of the low-dose slope.
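A minimal sketch of the proposed procedure under an assumed one-hit dose-response model (the procedure itself is model-agnostic near the 1% response level): solve for ED01, set ŝ = 0.01/ED01, and multiply by the dose of interest. The model form and parameter values are illustrative only.

```python
import math

b = 0.02                                   # hypothetical fitted model parameter

def excess_risk(dose):
    """One-hit dose-response model (illustrative choice)."""
    return 1.0 - math.exp(-b * dose)

# Solve excess_risk(ED01) = 0.01 by bisection
lo, hi = 0.0, 1000.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if excess_risk(mid) < 0.01:
        lo = mid
    else:
        hi = mid
ED01 = 0.5 * (lo + hi)

s_hat = 0.01 / ED01                        # stable low-dose slope estimate
low_dose = 0.001                           # hypothetical environmental dose
print(f"ED01 ≈ {ED01:.3f}, s_hat ≈ {s_hat:.4f}, "
      f"risk at dose {low_dose}: {s_hat * low_dose:.2e}")
```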

20.
Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single-value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates.
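A minimal Monte Carlo sketch of the full probabilistic approach: uncertain exposure and uncertain potency are sampled and multiplied, and the resulting distribution supplies a confidence interval for the risk estimate. The distributions and parameters are hypothetical, not those used for the three example chemicals in the study.

```python
import random

random.seed(1)
N = 100_000

risks = []
for _ in range(N):
    exposure = random.lognormvariate(mu=-2.0, sigma=1.0)   # hypothetical dose, mg/kg-day
    potency = random.lognormvariate(mu=-4.0, sigma=1.5)    # hypothetical slope, per mg/kg-day
    risks.append(exposure * potency)

risks.sort()
median = risks[N // 2]
lower, upper = risks[int(0.025 * N)], risks[int(0.975 * N)]
print(f"Median risk {median:.1e}, 95% CI [{lower:.1e}, {upper:.1e}]")
```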
