Similar Articles (20 results)
1.
A method is proposed for integrated probabilistic risk assessment in which exposure assessment and hazard characterization are both treated probabilistically. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size (CES). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose (ICED). Individuals in a population typically show variation, both in their individual exposure (IEXP) and in their ICED. Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between the two distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure (IMoE). The proportion of the IMoE distribution below unity is the probability of critical exposure (PoCE) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., regarding both exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the PoCE. The method is illustrated using data on dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions of the various sources of uncertainty may be quantified.
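The core calculation described above (combining an IEXP distribution with an ICED distribution into an IMoE distribution, then reading off the PoCE) can be sketched in a few lines of Monte Carlo. The lognormal parameters below are purely illustrative, not the acephate case-study values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical lognormal distributions for individual exposure (IEXP)
# and individual critical effect dose (ICED); parameters are invented
# for illustration, not taken from the acephate case study.
iexp = rng.lognormal(mean=0.0, sigma=1.0, size=n)   # e.g. ug/kg bw/day
iced = rng.lognormal(mean=3.0, sigma=0.5, size=n)

# Individual margin of exposure: ICED / IEXP (independence assumed,
# as in the method described above)
imoe = iced / iexp

# Probability of critical exposure: fraction of the population with IMoE < 1
poce = np.mean(imoe < 1.0)
print(f"PoCE = {poce:.4f}")
```

Wrapping this simulation in a bootstrap loop over the input data would produce the uncertainty distribution for PoCE that the abstract describes.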

2.
Risk Analysis, 2018, 38(6): 1223-1238
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide‐handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach.

3.
This paper presents the results of a study that identified how often a probabilistic risk assessment (PRA) should be updated to accommodate the changes that take place at nuclear power plants. Based on a 7-year analysis of design and procedural changes at one plant, we consider 5 years to be the maximum interval for updating PRAs. This conclusion is preliminary because it is based on the review of changes that occurred at a single plant, and it addresses only PRAs that involve a Level 1 analysis (i.e., a PRA including calculation of core damage frequency only). Nevertheless, this conclusion indicates that maintaining a useful PRA requires periodic updating efforts. However, the need for this periodic update stems only partly from the number of changes that can be expected to take place at nuclear power plants: changes that individually have only a moderate to minor impact on the PRA, but whose combined impact is substantial and necessitates a PRA update. Additionally, a comparison of two generations of PRAs performed about 5 years apart indicates that PRAs must be periodically updated to reflect the evolution of PRA methods. The most desirable updating interval depends on these two technical considerations as well as the cost of updating the PRA. (Cost considerations, however, were beyond the scope of this study.)

4.
Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
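The percentile inflation described above is straightforward to reproduce: multiply a few independent exposure variables, then compare the product of their individual upper percentiles against the percentiles of the simulated product. All distributions here are hypothetical, chosen only to show the compounding effect:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Three hypothetical, independent exposure variables (illustrative lognormals)
intake = rng.lognormal(0.0, 0.6, n)       # e.g. L/day
conc = rng.lognormal(0.0, 0.6, n)         # e.g. mg/L
duration = rng.lognormal(0.0, 0.6, n)     # unitless scaling factor

dose_mc = intake * conc * duration

# Deterministic "point estimate": each input set at its own 95th percentile
point = (np.quantile(intake, 0.95) * np.quantile(conc, 0.95)
         * np.quantile(duration, 0.95))

# Percentile rank of the point estimate within the simulated dose distribution:
# stacking upper-bound inputs lands well beyond the 95th percentile person
pct = np.mean(dose_mc <= point)
print(f"point estimate falls at the {100 * pct:.1f}th percentile")
```

With more multiplied variables the effect compounds further, which is the mechanism behind the 99.9th-percentile result quoted in the abstract.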

5.
Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, using a fixed uncertainty factor rather than probabilistically accounting for human variability cannot adequately support the probabilistic risk assessment advocated by a number of researchers; new methods are needed to probabilistically quantify human population variability. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that the proposed hierarchical structure adequately characterizes variability across different populations.

6.
Roy L. Smith. Risk Analysis, 1994, 14(4): 433-439
This work presents a comparison of probabilistic and deterministic health risk estimates based on data from an industrial site in the northeastern United States. The risk assessment considered exposures to volatile solvents by drinking water ingestion and showering. Probability densities used as inputs included concentrations, contact rates, and exposure frequencies; dose-response inputs were single values. Deterministic risk estimates were calculated by the "reasonable maximum exposure" (RME) approach recommended by the EPA Superfund program. The RME non-carcinogenic risk fell between the 90th and the 95th percentile of the probability density; the RME cancer risk fell between the 95th percentile and the maximum. These results suggest that in this case (1) EPA's deterministic RME risk was reasonably protective, (2) results of probabilistic and deterministic calculations were consistent, and (3) commercially available Monte Carlo software effectively provided the multiple risk estimates recommended by recent EPA guidance.

7.
Risks from exposure to contaminated land are often assessed with the aid of mathematical models. The current probabilistic approach is a considerable improvement on previous deterministic risk assessment practices, in that it attempts to characterize uncertainty and variability. However, some inputs continue to be assigned as precise numbers, while others are characterized as precise probability distributions. Such precision is hard to justify, and we show in this article how rounding errors and distribution assumptions can affect an exposure assessment. The outcomes of traditional deterministic point estimates and Monte Carlo simulations were compared to probability bounds analyses. Assigning all scalars as imprecise numbers (intervals prescribed by significant digits) added uncertainty to the deterministic point estimate of about one order of magnitude. Similarly, representing probability distributions as probability boxes added several orders of magnitude to the uncertainty of the probabilistic estimate. This indicates that the size of the uncertainty in such assessments is actually much greater than currently reported. The article suggests that full disclosure of the uncertainty may facilitate decision making by opening up a negotiation window. In the risk analysis process, it is also an ethical obligation to clarify the boundary between the scientific and social domains.
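The interval part of such an analysis can be sketched minimally: treat each rounded scalar as the interval of real values consistent with its reported precision, then propagate those intervals through a simple multiplicative exposure equation. The values and the equation here are illustrative, not taken from the study:

```python
# Interval arithmetic sketch: rounded inputs treated as imprecise numbers.

def interval_from_rounded(x: float, decimals: int) -> tuple:
    """Interval of real values that round to x at the given precision."""
    half = 0.5 * 10 ** (-decimals)
    return (x - half, x + half)

def mul(a: tuple, b: tuple) -> tuple:
    """Product of two intervals: extremes occur at endpoint combinations."""
    candidates = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(candidates), max(candidates))

conc = interval_from_rounded(0.2, 1)    # hypothetical mg/kg, 1 decimal place
intake = interval_from_rounded(0.1, 1)  # hypothetical kg/day, 1 decimal place
dose = mul(conc, intake)
print(dose)  # lower and upper bounds on the possible dose
```

Even in this two-factor toy case, the upper bound of the dose interval is five times the lower bound, which is how "precision" that was never really there inflates into an order of magnitude across a realistic multi-factor assessment.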

8.
Nearly ten years have passed since the publication in August 1974 of the draft Reactor Safety Study (WASH 1400), the first detailed attempt to apply probabilistic risk assessment (PRA) techniques to estimate the public risks posed by commercial nuclear power plants. Now is an opportune time to look back and see how PRA has fared over these ten years. We will not attempt to pass judgement on how the Reactor Safety Study report itself has withstood the test of time, as that task is best left to others less directly involved in preparing the report. Instead, we will examine advances in the understanding, acceptance, and utilization of PRA techniques, as well as technical advances in PRA methods. Some of the significant insights gained from PRAs will be discussed. Finally, some observations on the future of PRA will be offered.

9.
Ethylene oxide is a gas produced in large quantities in the United States that is used primarily as a chemical intermediate in the production of ethylene glycol, propylene glycol, non-ionic surfactants, ethanolamines, glycol ethers, and other chemicals. It has been well established that ethylene oxide can induce cancer, genetic, reproductive and developmental, and acute health effects in animals. The U.S. Environmental Protection Agency is currently developing both a cancer potency factor and a reference concentration (RfC) for ethylene oxide. This study used the rich database on the reproductive and developmental effects of ethylene oxide to develop a probabilistic characterization of possible regulatory thresholds for ethylene oxide. This analysis was based on the standard regulatory approach for noncancer risk assessment, but involved several innovative elements, such as: (1) the use of advanced statistical methods to account for correlations in developmental outcomes among littermates and allow for simultaneous control of covariates (such as litter size); (2) the application of a probabilistic approach for characterizing the uncertainty in extrapolating the animal results to humans; and (3) the use of a quantitative approach to account for the variation in heterogeneity among the human population. This article presents several classes of results, including: (1) probabilistic characterizations of ED10s for two quantal reproductive outcomes (resorption and fetal death); (2) probabilistic characterizations of one developmental outcome (the dose expected to yield a 5% reduction in fetal (or pup) weight); (3) estimates of the RfCs that would result from using these values in the standard regulatory approach for noncancer risk assessment; and (4) a probabilistic characterization of the level of ethylene oxide exposure that would be expected to yield a 1/1,000 increase in the risk of reproductive or developmental outcomes in exposed human populations.

10.
Twenty-four-hour recall data from the Continuing Survey of Food Intake by Individuals (CSFII) are frequently used to estimate dietary exposure for risk assessment. Food frequency questionnaires are traditional instruments of epidemiological research; however, their application in dietary exposure and risk assessment has been limited. This article presents a probabilistic method of bridging the National Health and Nutrition Examination Survey (NHANES) food frequency and the CSFII data to estimate longitudinal (usual) intake, using a case study of seafood mercury exposures for two population subgroups (females 16 to 49 years and children 1 to 5 years). Two hundred forty-nine CSFII food codes were mapped into 28 NHANES fish/shellfish categories. FDA and state/local seafood mercury data were used. A uniform distribution with minimum and maximum blood-diet ratios of 0.66 to 1.07 was assumed. A probabilistic assessment was conducted to estimate distributions of individual 30-day average daily fish/shellfish intakes, methyl mercury exposure, and blood levels. The upper percentile estimates of fish and shellfish intakes based on the 30-day daily averages were lower than those based on two- and three-day daily averages. These results support previous findings that distributions of "usual" intakes based on a small number of consumption days provide overestimates in the upper percentiles. About 10% of the females (16 to 49 years) and children (1 to 5 years) may be exposed to mercury levels above the EPA's RfD. The predicted 75th and 90th percentile blood mercury levels for the females in the 16-to-49-year group were similar to those reported by NHANES. The predicted 90th percentile blood mercury level for children in the 1-to-5-year subgroup was similar to NHANES, and the 75th percentile estimate was slightly above the NHANES estimate.
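The averaging-window effect described above (upper percentiles of "usual" intake shrinking as the window grows) can be illustrated with a toy consumption model. The consumption frequency and amounts below are invented for illustration and have no connection to the CSFII or NHANES data:

```python
import numpy as np

rng = np.random.default_rng(2)
n_people = 50_000

def daily_avg(n_days: int) -> np.ndarray:
    """Per-person average daily intake over an n_days window.

    Hypothetical model: a person eats fish on 20% of days, and the amount
    on an eating day is lognormal (grams). Non-eating days contribute zero.
    """
    eats = rng.random((n_people, n_days)) < 0.2
    amount = rng.lognormal(4.0, 0.8, (n_people, n_days))
    return (eats * amount).mean(axis=1)

p95_2day = np.quantile(daily_avg(2), 0.95)
p95_30day = np.quantile(daily_avg(30), 0.95)
print(p95_2day, p95_30day)  # the short window yields a much higher 95th pct
```

With only two sampled days, a single large eating occasion dominates the average, inflating the upper percentiles; over thirty days the occasional large intakes are diluted toward each person's long-run mean, which is the mechanism behind the finding quoted in the abstract.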

11.
There are many uncertainties in a probabilistic risk analysis (PRA). We identify the different types of uncertainties and describe their implications. We then summarize the uncertainty analyses that have been performed in current PRAs and characterize the results that have been obtained. We draw conclusions regarding interpretations of uncertainties, the areas having the largest uncertainties, and the needs that exist in uncertainty analysis. Finally, we characterize the robustness of various utilizations of PRA results.

12.
Moolgavkar, Suresh H.; Luebeck, E. Georg; Turim, Jay; Hanna, Linda. Risk Analysis, 1999, 19(4): 599-611
We present the results of a quantitative assessment of the lung cancer risk associated with occupational exposure to refractory ceramic fibers (RCF). The primary sources of data for our risk assessment were two long-term oncogenicity studies in male Fischer rats conducted to assess the potential pathogenic effects associated with prolonged inhalation of RCF. An interesting feature of the data was the availability of the temporal profile of fiber burden in the lungs of experimental animals. Because of this information, we were able to conduct both exposure–response and dose–response analyses. Our risk assessment was conducted within the framework of a biologically based model for carcinogenesis, the two-stage clonal expansion model, which allows for the explicit incorporation of the concepts of initiation and promotion in the analyses. We found that a model positing that RCF was an initiator had the highest likelihood. We proposed an approach based on biological considerations for the extrapolation of risk to humans. This approach requires estimation of human lung burdens for specific exposure scenarios, which we did by using an extension of a model due to Yu. Our approach acknowledges that the risk associated with exposure to RCF depends on exposure to other lung carcinogens. We present estimates of risk in two populations: (1) a population of nonsmokers and (2) an occupational cohort of steelworkers not exposed to coke oven emissions, a mixed population that includes both smokers and nonsmokers.

13.
Recently, we developed a GIS-Integrated Integral Risk Index (IRI) to assess human health risks in areas with presence of environmental pollutants. Contaminants were previously ranked by applying a self-organizing map (SOM) to their characteristics of persistence, bioaccumulation, and toxicity in order to obtain the Hazard Index (HI). In the present study, the original IRI was substantially improved by allowing the entrance of probabilistic data. A neuroprobabilistic HI was developed by combining SOM and Monte Carlo analysis. In general terms, the deterministic and probabilistic HIs followed a similar pattern: polychlorinated biphenyls (PCBs) and light polycyclic aromatic hydrocarbons (PAHs) were the pollutants showing the highest and lowest values of HI, respectively. However, the bioaccumulation value of heavy metals notably increased after considering a probability density function to explain the bioaccumulation factor. To check its applicability, a case study was investigated. The probabilistic integral risk was calculated in the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain), where an environmental program has been carried out since 2002. The risk change between 2002 and 2005 was evaluated on the basis of probabilistic data of the levels of various pollutants in soils. The results indicated that the risk of the chemicals under study did not follow a homogeneous tendency. However, the current levels of pollution do not represent a relevant source of health risks for the local population. Moreover, the neuroprobabilistic HI seems to be an adequate tool to be taken into account in risk assessment processes.

14.
Biomagnification of organochlorine and other persistent organic contaminants by higher trophic level organisms represents one of the most significant sources of uncertainty and variability in evaluating potential risks associated with disposal of dredged materials. While it is important to distinguish between population variability (e.g., true population heterogeneity in fish weight, and lipid content) and uncertainty (e.g., measurement error), they can be operationally difficult to define separately in probabilistic estimates of human health and ecological risk. We propose a disaggregation of uncertain and variable parameters based on: (1) availability of supporting data; (2) the specific management and regulatory context (in this case, of the U.S. Army Corps of Engineers/U.S. Environmental Protection Agency tiered approach to dredged material management); and (3) professional judgment and experience in conducting probabilistic risk assessments. We describe and quantitatively evaluate several sources of uncertainty and variability in estimating risk to human health from trophic transfer of polychlorinated biphenyls (PCBs) using a case study of sediments obtained from the New York-New Jersey Harbor and being evaluated for disposal at an open water off-shore disposal site within the northeast region. The estimates of PCB concentrations in fish and dietary doses of PCBs to humans ingesting fish are expressed as distributions of values, of which the arithmetic mean or mode represents a particular fractile. The distribution of risk values is obtained using a food chain biomagnification model developed by Gobas by specifying distributions for input parameters disaggregated to represent either uncertainty or variability. Only those sources of uncertainty that could be quantified were included in the analysis. Results for several different two-dimensional Latin Hypercube analyses are provided to evaluate the influence of the uncertain versus variable disaggregation of model parameters. The analysis suggests that variability in human exposure parameters is greater than the uncertainty bounds on any particular fractile, given the described assumptions.

15.
For several years machine learning methods have been proposed for risk classification. While machine learning methods have also been used for failure diagnosis and condition monitoring, to the best of our knowledge, these methods have not been used for probabilistic risk assessment. Probabilistic risk assessment is a subjective process. The problem of how well machine learning methods can emulate expert judgments is challenging. Expert judgments are based on mental shortcuts, heuristics, which are susceptible to biases. This paper presents a process for developing natural language-based probabilistic risk assessment models, applying deep learning algorithms to emulate experts' quantified risk estimates. This allows the risk analyst to obtain an a priori risk assessment when there is limited information in the form of text and numeric data. Universal sentence embedding (USE) with gradient boosting regression (GBR) trees trained over limited structured data presented the most promising results. When we apply these models' outputs to generate survival distributions for autonomous systems' likelihood of loss with distance, we observe that for open water and ice shelf operating environments, the differences between the survival distributions generated by the machine learning algorithm and those generated by the experts are not statistically significant.

16.
The dose–response relationship between folate levels and cognitive impairment among individuals with vitamin B12 deficiency is an essential component of a risk-benefit analysis approach to regulatory and policy recommendations regarding folic acid fortification. Epidemiological studies provide data that are potentially useful for addressing this research question, but the lack of analysis and reporting of data in a manner suitable for dose–response purposes hinders the application of the traditional evidence synthesis process. This study aimed to estimate a quantitative dose–response relationship between folate exposure and the risk of cognitive impairment among older adults with vitamin B12 deficiency using "probabilistic meta-analysis," a novel approach for synthesizing data from observational studies. Second-order multistage regression was identified as the best-fit model for the association between the probability of cognitive impairment and serum folate levels based on data generated by randomly sampling probabilistic distributions with parameters estimated based on summarized information reported in relevant publications. The findings indicate a "J-shape" effect of serum folate levels on the occurrence of cognitive impairment. In particular, an excessive level of folate exposure is predicted to be associated with a higher risk of cognitive impairment, albeit with greater uncertainty than the association between low folate exposure and cognitive impairment. This study directly contributes to the development of a practical solution to synthesize observational evidence for dose–response assessment purposes, which will help strengthen future nutritional risk assessments for the purpose of informing decisions on nutrient fortification in food.

17.
Management of invasive species depends on developing prevention and control strategies through comprehensive risk assessment frameworks that need a thorough analysis of exposure to invasive species. However, accurate exposure analysis of invasive species can be a daunting task because of the inherent uncertainty in invasion processes. Risk assessment of invasive species under uncertainty requires potential integration of expert judgment with empirical information, which often can be incomplete, imprecise, and fragmentary. The representation of knowledge in classical risk models depends on the formulation of a precise probabilistic value or well-defined joint distribution of unknown parameters. However, expert knowledge and judgments are often represented in value-laden terms or preference-ordered criteria. We offer a novel approach to risk assessment by using a dominance-based rough set approach to account for preference order in the domains of attributes in the set of risk classes. The model is illustrated with an example showing how a knowledge-centric risk model can be integrated with the dominance-based principle of rough sets to derive minimal covering "if..., then..." decision rules to reason over a set of possible invasion scenarios. The inconsistency and ambiguity in the data set are modeled using the rough set concept of a boundary region adjoining the lower and upper approximations of risk classes. Finally, we present an extension of rough sets to an evidence-theoretic interpretation of risk measures of invasive species in a spatial context. In this approach, the multispecies interactions in an invasion risk are approximated with imprecise probability measures through a combination of spatial neighborhood information of risk estimation in terms of belief and plausibility.

18.
Pesticide risk assessment for food products involves combining information from consumption and concentration data sets to estimate a distribution for the pesticide intake in a human population. Using this distribution one can obtain probabilities of individuals exceeding specified levels of pesticide intake. In this article, we present a probabilistic, Bayesian approach to modeling the daily consumptions of the pesticide Iprodione through multiple food products. Modeling data on food consumption and pesticide concentration poses a variety of problems, such as the large proportions of consumptions and concentrations that are recorded as zero, and correlation between the consumptions of different foods. We consider daily food consumption data from the Netherlands National Food Consumption Survey and concentration data collected by the Netherlands Ministry of Agriculture. We develop a multivariate latent‐Gaussian model for the consumption data that allows for correlated intakes between products. For the concentration data, we propose a univariate latent‐t model. We then combine predicted consumptions and concentrations from these models to obtain a distribution for individual daily Iprodione exposure. The latent‐variable models allow for both skewness and large numbers of zeros in the consumption and concentration data. The use of a probabilistic approach is intended to yield more robust estimates of high percentiles of the exposure distribution than an empirical approach. Bayesian inference is used to facilitate the treatment of data with a complex structure.
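The way a latent-Gaussian construction absorbs large proportions of zeros can be sketched with a one-dimensional toy version: a latent normal variable is thresholded, producing an exact zero below the threshold and a skewed positive amount above it. The parameters here are illustrative, and the actual model in the abstract is multivariate with correlated products:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Toy latent-Gaussian consumption model: latent z ~ Normal(-0.5, 1).
# Days with z <= 0 are non-consumption days (observed zero); on other
# days the observed amount is exp(z), giving a right-skewed positive tail.
z = rng.normal(-0.5, 1.0, n)
consumption = np.where(z > 0.0, np.exp(z), 0.0)

zero_frac = (consumption == 0.0).mean()
print(f"zero-consumption days: {zero_frac:.3f}")
```

A single latent scale thus captures both the point mass at zero and the skewness of positive intakes; extending z to a correlated multivariate normal gives correlated intakes across food products, as the abstract describes.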

19.
Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N‐nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose‐incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates.
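The relationship between a single linear-extrapolation value and a probabilistic confidence interval can be sketched with an invented lognormal uncertainty distribution on a benchmark dose (BMD10); none of the numbers below come from the three chemicals' actual data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical uncertainty on a benchmark dose BMD10 (dose giving 10% extra
# tumor incidence), represented as a lognormal uncertainty distribution.
bmd10 = rng.lognormal(np.log(5.0), 0.5, 100_000)  # mg/kg bw/day

exposure = 0.001  # mg/kg bw/day, hypothetical human intake

# Linear extrapolation applied per Monte Carlo draw: risk ~ 0.10 * E / BMD10
risk = 0.10 * exposure / bmd10

lo, med, hi = np.quantile(risk, [0.05, 0.5, 0.95])
print(f"risk 90% CI: [{lo:.2e}, {hi:.2e}], median {med:.2e}")

# Single-value linear extrapolation from the BMD10 point estimate alone
point = 0.10 * exposure / 5.0
print(point < hi)  # the single value sits below the upper confidence limit
```

Because the point estimate ignores the left tail of the BMD uncertainty (i.e., the chance that the true BMD10 is lower than its central estimate), it falls below the upper 95% confidence limit of the probabilistic estimate, illustrating the abstract's point that linear extrapolation is not conservative in this sense.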

20.
Environmental tobacco smoke (ETS) is a major contributor to indoor human exposures to fine particulate matter of 2.5 μm or smaller (PM2.5). The Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS‐PM) Model developed by the U.S. Environmental Protection Agency estimates distributions of outdoor and indoor PM2.5 exposure for a specified population based on ambient concentrations and indoor emissions sources. A critical assessment was conducted of the methodology and data used in SHEDS‐PM for estimation of indoor exposure to ETS. For the residential microenvironment, SHEDS uses a mass‐balance approach, which is comparable to best practices. The default inputs in SHEDS‐PM were reviewed and more recent and extensive data sources were identified. Sensitivity analysis was used to determine which inputs should be prioritized for updating. Data regarding the proportion of smokers and "other smokers" and cigarette emission rate were found to be important. SHEDS‐PM does not currently account for in‐vehicle ETS exposure; however, in‐vehicle ETS‐related PM2.5 levels can exceed those in residential microenvironments by a factor of 10 or more. Therefore, a mass‐balance‐based methodology for estimating in‐vehicle ETS PM2.5 concentration is evaluated. Recommendations are made regarding updating of input data and algorithms related to ETS exposure in the SHEDS‐PM model. Interindividual variability for ETS exposure was quantified. Geographic variability in ETS exposure was quantified based on the varying prevalence of smokers in five selected locations in the United States.

