Similar Documents
20 similar documents found (search time: 31 ms)
1.
In quantitative uncertainty analysis, it is essential to define rigorously the endpoint or target of the assessment. Two distinctly different approaches using Monte Carlo methods are discussed: (1) the endpoint is a fixed but unknown value (e.g., the maximally exposed individual, the average individual, or a specific individual), or (2) the endpoint is an unknown distribution of values (e.g., the variability of exposures among unspecified individuals in the population). In the first case, values are sampled at random from distributions representing various "degrees of belief" about the unknown "fixed" values of the parameters to produce a distribution of model results. The distribution of model results represents a subjective confidence statement about the true but unknown assessment endpoint. The important input parameters are those that contribute most to the spread in the distribution of the model results. In the second case, Monte Carlo calculations are performed in two dimensions, producing numerous alternative representations of the true but unknown distribution. These alternative distributions permit subjective confidence statements to be made from two perspectives: (1) for the individual exposure occurring at a specified fractile of the distribution, or (2) for the fractile of the distribution associated with a specified level of individual exposure. The relative importance of input parameters will depend on the fractile or exposure level of interest. The quantification of uncertainty for the simulation of a true but unknown distribution of values represents the state of the art in assessment modeling.
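A minimal sketch of the two-dimensional (uncertainty-over-variability) Monte Carlo scheme described above, using a made-up exposure model (dose = intake x concentration / body weight); all distributions and numbers are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n_outer = 200   # uncertainty loop: alternative "true" population distributions
n_inner = 1000  # variability loop: individuals in the population

# Outer loop: sample uncertain distribution parameters ("degrees of belief").
log_mu = rng.normal(np.log(1.5), 0.2, size=n_outer)      # uncertain median intake (L/day)
conc = rng.lognormal(np.log(5.0), 0.3, size=n_outer)     # uncertain concentration (ug/L)

exposures = np.empty((n_outer, n_inner))
for i in range(n_outer):
    # Inner loop: interindividual variability given one realization of the
    # uncertain parameters.
    intake = rng.lognormal(log_mu[i], 0.5, size=n_inner)
    bw = rng.normal(70.0, 12.0, size=n_inner).clip(40, 120)
    exposures[i] = intake * conc[i] / bw                 # ug/kg-day

# Perspective 1: confidence statement on the exposure at a specified fractile.
p95 = np.percentile(exposures, 95, axis=1)
print("95th-fractile exposure, 90% CI:", np.percentile(p95, [5, 95]))

# Perspective 2: confidence statement on the fractile at a specified exposure level.
level = 0.2
frac_below = (exposures < level).mean(axis=1)
print(f"fraction below {level} ug/kg-day, 90% CI:", np.percentile(frac_below, [5, 95]))
```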

2.
The tenfold "uncertainty" factor traditionally used to guard against human interindividual differences in susceptibility to toxicity is not based on human observations. To begin to build a basis for quantifying an important component of overall variability in susceptibility to toxicity, a database has been constructed of individual measurements of key pharmacokinetic parameters for specific substances (mostly drugs) in groups of at least five healthy adults. 72 of the 101 data sets studied were positively skewed, indicating that the distributions are generally closer to expectations for log-normal distributions than for normal distributions. Measurements of interindividual variability in elimination half-lives, maximal blood concentrations, and AUC (area under the curve of blood concentration by time) have median values of log10 geometric standard deviations in the range of 0.11-0.145. For the median chemical, therefore, a tenfold difference in these pharmacokinetic parameters would correspond to 7-9 standard deviations in populations of normal healthy adults. For one relatively lipophilic chemical, however, interindividual variability in maximal blood concentration and AUC had a log10 geometric standard deviation of 0.4, implying that a tenfold difference would correspond to only about 2.5 standard deviations for those parameters in the human population. The parameters studied to date are only components of overall susceptibility to toxic agents, and do not include contributions from variability in exposure- and response-determining parameters. The current study also implicitly excludes most human interindividual variability from age and illness. When these other sources of variability are included in an overall analysis of variability in susceptibility, it is likely that a tenfold difference will correspond to fewer standard deviations in the overall population, and correspondingly greater numbers of people at risk of toxicity.
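A quick check of the arithmetic linking these figures: for a log-normal parameter with log10 geometric standard deviation s, a tenfold difference spans log10(10)/s = 1/s standard deviations. Using the values quoted in the abstract:

```python
import numpy as np

# log10 geometric standard deviations quoted in the abstract above
for s in (0.11, 0.145, 0.4):
    z = np.log10(10) / s   # number of SDs spanned by a tenfold difference
    print(f"log10 GSD = {s:<5} -> tenfold difference = {z:.1f} standard deviations")
# 0.11 -> 9.1, 0.145 -> 6.9, 0.4 -> 2.5, matching the "7-9" and "about 2.5" figures
```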

3.
The objective of this study is the estimation of health hazards due to the inhalation of combustion products from accidental mineral oil transformer fires. Calculations of production, dispersion, and subsequent human intake of polychlorinated dibenzofurans (PCDFs) provide us with exposure estimates. PCDFs are believed to be the principal toxic products of the pyrolysis of polychlorinated biphenyls (PCBs) sometimes found as contaminants in transformer mineral oil. Cancer burdens and birth defect hazard indices are estimated from population data and exposure statistics. Monte Carlo-derived variational factors emphasize the statistics of uncertainty in the estimates of risk parameters. Community health issues are addressed and risks are found to be insignificant.

4.
In any model, the values of estimates for various parameters are obtained from different sources, each with its own level of uncertainty. When the probability distributions of the estimates are obtained, as opposed to point values only, the measurement uncertainties in the parameter estimates may be addressed. However, the sources used for obtaining the data and the models used to select appropriate distributions are of differing degrees of uncertainty. A hierarchy of different sources of uncertainty, based upon one's ability to validate data and models empirically, is presented. When model parameters are aggregated with different levels of the hierarchy represented, the utility and validity of the resulting models are distorted or degraded. Means to identify and deal with such heterogeneous data sources are explored, and a number of approaches to addressing this problem are presented. One approach, using Range/Confidence Estimates coupled with an Information Value Analysis Process, is presented as an example.

5.
Benzene is myelotoxic and leukemogenic in humans exposed at high doses (above 1 ppm, and more definitively above 10 ppm) for extended periods. However, leukemia risks at lower exposures are uncertain. Benzene occurs widely in the work environment and in indoor air, but mostly below 1 ppm, so assessing the leukemia risks at these low concentrations is important. Here, we describe a human physiologically-based pharmacokinetic (PBPK) model that quantifies tissue doses of benzene and its key metabolites, benzene oxide, phenol, and hydroquinone, after inhalation and oral exposures. The model was integrated into a statistical framework that acknowledges sources of variation due to inherent intra- and interindividual variation, measurement error, and other data collection issues. A primary contribution of this work is the estimation of population distributions of key PBPK model parameters. We hypothesized that observed interindividual variability in the dosimetry of benzene and its metabolites resulted primarily from known or estimated variability in key metabolic parameters, and that a statistical PBPK model that explicitly included variability in only those metabolic parameters would sufficiently describe the observed variability. We then identified parameter distributions for the PBPK model to characterize observed variability through the use of Markov chain Monte Carlo analysis applied to two data sets. The identified parameter distributions described most of the observed variability, but variability in physiological parameters such as organ weights may also be helpful to faithfully predict the observed human-population variability in benzene dosimetry.
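The Markov chain Monte Carlo step can be illustrated with a deliberately stripped-down sketch: a random-walk Metropolis sampler recovering a single metabolic parameter from noisy concentration data. The one-line stand-in model, the prior, and all numbers are hypothetical; the paper's hierarchical PBPK calibration is far richer:

```python
import numpy as np

rng = np.random.default_rng(7)

true_vmax = 3.0
def predicted_conc(vmax, exposure=1.0):
    """Stand-in for the PBPK model: concentration falls as metabolism rises."""
    return exposure / (1.0 + vmax)

# Synthetic "observed" data generated with the known parameter value.
obs = predicted_conc(true_vmax) + rng.normal(0, 0.02, size=20)

def log_post(vmax, sd=0.02):
    if vmax <= 0:
        return -np.inf
    resid = obs - predicted_conc(vmax)
    # Gaussian likelihood plus a weak lognormal prior (median 1, log-sd 1)
    return -0.5 * np.sum((resid / sd) ** 2) - 0.5 * (np.log(vmax) / 1.0) ** 2

# Random-walk Metropolis over the metabolic parameter.
chain, cur = [], 1.0
lp = log_post(cur)
for _ in range(20_000):
    prop = cur + rng.normal(0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # accept/reject step
        cur, lp = prop, lp_prop
    chain.append(cur)

post = np.array(chain[5000:])   # discard burn-in
print("posterior Vmax: median %.2f, 95%% CI (%.2f, %.2f)"
      % (np.median(post), *np.percentile(post, [2.5, 97.5])))
```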

6.
Retailers often face a newsvendor problem. Advance selling helps retailers reduce demand uncertainty. Consumers, however, may prefer not to purchase in advance unless given a discount, because they are uncertain about their valuation for the product in advance. It is therefore unclear whether, or when, advance selling that passes some uncertainty risk to consumers is optimal for the retailer. This paper examines advance selling price and inventory decisions in a two-period setting, where the first period is the advance selling period and the second is the selling (and consumption) period. We find that an advance selling strategy is not always optimal, but is contingent on parameters of the market (e.g., market potential and uncertainty) and of the consumers (e.g., valuation, risk aversion, and heterogeneity). For example, we find that retailers should sell in advance if the consumers' expected valuation exceeds the consumers' expected surplus from not buying early by a certain threshold. This threshold increases with the degree of risk aversion but decreases with stockout risk. If the degree of risk aversion varies across consumers, then a retailer should sell in advance if the probability that a consumer spot buys is less than a critical fractile.
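The critical-fractile logic the authors build on is the classic newsvendor quantity rule; a minimal sketch with hypothetical prices and demand parameters (not the paper's two-period model):

```python
from scipy.stats import norm

# Classic newsvendor critical fractile (the building block the paper extends).
c, p, s = 6.0, 10.0, 2.0               # unit cost, price, salvage value (hypothetical)
critical_fractile = (p - c) / (p - s)  # optimal probability of covering all demand

# Normally distributed spot-period demand (hypothetical parameters).
mu, sigma = 500.0, 120.0
q_star = norm.ppf(critical_fractile, loc=mu, scale=sigma)
print(f"critical fractile = {critical_fractile:.2f}, optimal stock = {q_star:.0f}")
# The abstract's condition compares a consumer's spot-buy probability
# against a fractile of this kind.
```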

7.
The human toxicity potential, a weighting scheme used to evaluate toxic emissions for life cycle assessment and toxics release inventories, is based on potential dose calculations and toxicity factors. This paper evaluates the variance in potential dose calculations that can be attributed to the uncertainty in chemical-specific input parameters as well as the variability in exposure factors and landscape parameters. A knowledge of the uncertainty allows us to assess the robustness of a decision based on the toxicity potential; a knowledge of the sources of uncertainty allows us to focus our resources if we want to reduce the uncertainty. The potential dose of 236 chemicals was assessed. The chemicals were grouped by dominant exposure route, and a Monte Carlo analysis was conducted for one representative chemical in each group. The variance is typically one to two orders of magnitude. For comparison, the point estimates in potential dose for 236 chemicals span ten orders of magnitude. Most of the variance in the potential dose is due to chemical-specific input parameters, especially half-lives, although exposure factors such as fish intake and the source of drinking water can be important for chemicals whose dominant exposure is through indirect routes. Landscape characteristics are generally of minor importance.

8.
Variability is the heterogeneity of values within a population. Uncertainty refers to lack of knowledge regarding the true value of a quantity. Mixture distributions have the potential to improve the goodness of fit to data sets not adequately described by a single parametric distribution. Uncertainty due to random sampling error in statistics of interest can be estimated based upon bootstrap simulation. In order to evaluate the robustness of using a mixture distribution as a basis for estimating both variability and uncertainty, 108 synthetic data sets generated from selected population mixture log-normal distributions were investigated, and properties of variability and uncertainty estimates were evaluated with respect to variation in sample size, mixing weight, and separation between components of mixtures. Furthermore, mixture distributions were compared with single-component distributions. Findings include: (1) mixing weight influences the stability of variability and uncertainty estimates; (2) bootstrap simulation results tend to be more stable for larger sample sizes; (3) when two components are well separated, the stability of bootstrap simulation is improved; however, a larger degree of uncertainty arises regarding the percentiles coinciding with the separated region; (4) when two components are not well separated, a single distribution may often be a better choice because it has fewer parameters and better numerical stability; and (5) dependencies exist in sampling distributions of parameters of mixtures and are influenced by the amount of separation between the components. An emission factor case study based upon NOx emissions from coal-fired tangential boilers is used to illustrate the application of the approach.
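A minimal sketch of the bootstrap step for a two-component lognormal mixture: synthetic data stand in for an observed data set, and the sampling uncertainty of a variability statistic (here the 95th percentile) is bootstrapped. The mixture parameters are illustrative, not from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_mixture(n, w=0.4, mu=(0.0, 1.5), sigma=(0.3, 0.3)):
    """Draw n values from a two-component lognormal mixture (illustrative)."""
    comp = rng.random(n) < w
    return np.where(comp,
                    rng.lognormal(mu[0], sigma[0], n),
                    rng.lognormal(mu[1], sigma[1], n))

data = sample_mixture(n=100)   # synthetic "observed" data set

# Bootstrap the sampling uncertainty of the 95th percentile.
boot = np.array([
    np.percentile(rng.choice(data, size=data.size, replace=True), 95)
    for _ in range(2000)
])
print("95th percentile (variability):", np.percentile(data, 95))
print("90% bootstrap CI (uncertainty):", np.percentile(boot, [5, 95]))
```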

9.
Traditionally, microbial risk assessors have used point estimates to evaluate the probability that an individual will become infected. We developed a quantitative approach that shifts the risk characterization perspective from point estimate to distributional estimate, and from individual to population. To this end, we first designed and implemented a dynamic model that tracks traditional epidemiological variables, such as the numbers of susceptible, infected, diseased, and immune individuals, and environmental variables, such as pathogen density. Second, we used a simulation methodology that explicitly acknowledges the uncertainty and variability associated with the data. Specifically, the approach consists of assigning probability distributions to each parameter, sampling from these distributions for Monte Carlo simulations, and using a binary classification to assess the output of each simulation. A case study is presented that explores the uncertainties in assessing the risk of giardiasis when swimming in a recreational impoundment using reclaimed water. Using literature-based information to assign parameter ranges, our analysis demonstrated that the parameter describing the shedding of pathogens by infected swimmers contributed most to the uncertainty in risk. The importance of other parameters depended on reducing the a priori range of this shedding parameter. When the shedding parameter was constrained to its lower subrange, treatment efficiency was the parameter most important in predicting whether a simulation resulted in prevalences above or below nonoutbreak levels, whereas parameters associated with human exposure were important when the shedding parameter was constrained to its higher subrange. This Monte Carlo simulation technique identified conditions in which outbreaks and/or nonoutbreaks are likely and identified the parameters that most contributed to the uncertainty associated with a risk prediction.
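A compact illustration of the simulate-then-classify idea: a discrete-time SIR model with Monte Carlo-sampled parameters, each run classified as outbreak or nonoutbreak against a threshold. The model, threshold, and parameter ranges are hypothetical and far simpler than the giardiasis model described:

```python
import numpy as np

rng = np.random.default_rng(2)

def epidemic_peak(beta, gamma, n=10_000, i0=10, days=365):
    """Discrete-time SIR; returns peak prevalence as a fraction of the population."""
    s, i, r = n - i0, i0, 0
    peak = i
    for _ in range(days):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak / n

outbreak_threshold = 0.05   # hypothetical "nonoutbreak" prevalence level
n_sims, outbreaks = 1000, 0
for _ in range(n_sims):
    beta = rng.uniform(0.1, 0.5)    # sampled transmission parameter (uncertain)
    gamma = rng.uniform(0.1, 0.3)   # sampled recovery parameter (uncertain)
    # Binary classification of each simulation's output
    outbreaks += epidemic_peak(beta, gamma) > outbreak_threshold

print(f"{outbreaks}/{n_sims} simulations classified as outbreaks")
```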

10.
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probabilistic phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population, which is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor, UFH-TK, was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical in the target tissue or organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than the applied dose. It allows for the replacement of the default adult and child intraspecies UFs with toxicokinetic data-derived values and provides accurate chemical-specific estimates of their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
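The headline statistic reduces to a percentile ratio on the simulated tissue-dose distribution; a minimal sketch with a hypothetical lognormal dose distribution standing in for the probabilistic toxicokinetic output:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical population distribution of annual-average target-tissue dose,
# as would come out of a probabilistic toxicokinetic simulation.
tissue_dose = rng.lognormal(mean=np.log(2.0), sigma=0.45, size=100_000)

# Data-derived intraspecies toxicokinetic UF: 95th / 50th percentile of dose.
uf_h_tk = np.percentile(tissue_dose, 95) / np.percentile(tissue_dose, 50)
print(f"data-derived UFH-TK: {uf_h_tk:.2f}")
# Compare with the default toxicokinetic half (3.2x) of the 10-fold intraspecies UF.
```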

11.
Variability and Uncertainty Meet Risk Management and Risk Communication
In the past decade, the use of probabilistic risk analysis techniques to quantitatively address variability and uncertainty in risks increased in popularity, as recommended by the National Research Council's 1994 report Science and Judgment in Risk Assessment. Under the 1996 Food Quality Protection Act, for example, the U.S. EPA supported the development of tools that produce distributions of risk demonstrating the variability and/or uncertainty in the results. This paradigm shift away from the use of point estimates creates new challenges for risk managers, who now struggle with decisions about how to use distributions in decision making. The challenges for risk communication, however, have only been minimally explored. This presentation uses the case studies of variability in the risks of dying on the ground from a crashing airplane and from the deployment of motor vehicle airbags to demonstrate how better characterization of variability and uncertainty in the risk assessment leads to better risk communication. Analogies to food safety and environmental risks are also discussed. This presentation demonstrates that probabilistic risk assessment has an impact on both risk management and risk communication, and highlights remaining research issues associated with using improved sensitivity and uncertainty analyses in risk assessment.

12.
Jan F. Van Impe, Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and where possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of the variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used, an ANOVA-like model and Sobol sensitivity indices, to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment to estimate the risk of contracting listeriosis due to consumption of deli meats.
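Sobol first-order indices can be estimated with the standard two-matrix (Saltelli-style) scheme; a self-contained sketch on a toy risk model, where the model and input ranges are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def model(x):
    """Toy risk model (illustrative): baseline contamination plus growth term."""
    contamination, temp, time = x[:, 0], x[:, 1], x[:, 2]
    return contamination + 0.5 * temp * time

n, d = 100_000, 3
A = rng.random((n, d))        # first sample matrix
B = rng.random((n, d))        # second, independent sample matrix
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for j, name in enumerate(["contamination", "temperature", "time"]):
    ABj = A.copy()
    ABj[:, j] = B[:, j]       # A with column j swapped in from B
    yABj = model(ABj)
    # Saltelli (2010) estimator of the first-order index S_j
    s_j = np.mean(yB * (yABj - yA)) / var_y
    print(f"S_{name} = {s_j:.2f}")
```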

13.
A screening approach is developed for volatile organic compounds (VOCs) to estimate exposures that correspond to levels measured in fluids and/or tissues in human biomonitoring studies. The approach makes use of a generic physiologically-based pharmacokinetic (PBPK) model coupled with exposure pattern characterization, Monte Carlo analysis, and quantitative structure property relationships (QSPRs). QSPRs are used for VOCs with minimal data to develop chemical-specific parameters needed for the PBPK model. The PBPK model is capable of simulating VOC kinetics following multiple routes of exposure, such as oral exposure via water ingestion and inhalation exposure during shower events. Using published human biomonitoring data of trichloroethylene (TCE), the generic model is evaluated to determine how well it estimates TCE concentrations in blood based on the known drinking water concentrations. In addition, Monte Carlo analysis is conducted to characterize the impact of the following factors: (1) uncertainties in the QSPR-estimated chemical-specific parameters; (2) variability in physiological parameters; and (3) variability in exposure patterns. The results indicate that uncertainty in chemical-specific parameters makes only a minor contribution to the overall variability and uncertainty in the predicted TCE concentrations in blood. The model is used in a reverse dosimetry approach to derive estimates of TCE concentrations in drinking water based on given measurements of TCE in blood, for comparison to the U.S. EPA's Maximum Contaminant Level in drinking water. This example demonstrates how a reverse dosimetry approach can be used to facilitate interpretation of human biomonitoring data in a health risk context by deriving external exposures that are consistent with a biomonitoring data set, thereby permitting comparison with health-based exposure guidelines.
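The reverse-dosimetry step can be sketched under an assumed linear, steady-state relation between external and blood concentrations: run the forward Monte Carlo at unit exposure, then divide the measured blood level by the simulated unit response. Every distribution below is a hypothetical placeholder, not the paper's PBPK model:

```python
import numpy as np

rng = np.random.default_rng(8)

# Forward Monte Carlo at unit drinking-water concentration (1 ug/L):
# blood concentration per unit exposure varies with physiology and behavior.
n = 50_000
water_intake = rng.lognormal(np.log(1.4), 0.3, n)    # L/day (assumed)
shower_factor = rng.uniform(1.0, 2.5, n)             # inhalation add-on during showers (assumed)
clearance = rng.lognormal(np.log(12.0), 0.25, n)     # effective clearance, L/day (assumed)

blood_per_unit = water_intake * shower_factor / clearance   # ug/L blood per ug/L water

# Reverse dosimetry: measured blood concentration -> consistent water concentrations.
measured_blood = 0.05   # ug/L, hypothetical biomonitoring result
water_conc = measured_blood / blood_per_unit
print("inferred water conc (ug/L), 5th/50th/95th:",
      np.percentile(water_conc, [5, 50, 95]).round(2))
```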

14.
A simple and useful characterization of many predictive models is in terms of model structure and model parameters. Accordingly, uncertainties in model predictions arise from uncertainties in the values assumed by the model parameters (parameter uncertainty) and the uncertainties and errors associated with the structure of the model (model uncertainty). When assessing uncertainty, one is interested in identifying, at some level of confidence, the range of possible and then probable values of the unknown of interest. All sources of uncertainty and variability need to be considered. Although parameter uncertainty assessment has been extensively discussed in the literature, model uncertainty is a relatively new topic of discussion by the scientific community, despite often being the major contributor to the overall uncertainty. This article describes a Bayesian methodology for the assessment of model uncertainties, where models are treated as sources of information on the unknown of interest. The general framework is then specialized for the case where models provide point estimates about a single-valued unknown, and where information about the models is available in the form of homogeneous and nonhomogeneous performance data (pairs of experimental observations and model predictions). Several example applications for physical models used in fire risk analysis are also provided.
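One way to read "models as sources of information": each model's point estimate enters a Bayesian update through an error distribution characterized from its performance data. A grid-based sketch for a single-valued unknown; the models, biases, and error spreads are hypothetical:

```python
import numpy as np
from scipy.stats import norm

# Unknown of interest: a single-valued quantity (e.g., a peak heat release rate).
grid = np.linspace(50, 400, 2000)          # candidate values of the unknown
posterior = np.ones_like(grid) / grid.size # flat prior

# Each model: (point estimate, log-space bias, log-space error sd),
# with bias/sd assumed to come from performance data (all values hypothetical).
models = [
    (180.0, 0.05, 0.20),
    (230.0, -0.10, 0.35),
]

for pred, bias, sd in models:
    # Likelihood of observing this prediction given each candidate true value:
    # log(pred) ~ Normal(log(true) + bias, sd)
    posterior *= norm.pdf(np.log(pred), loc=np.log(grid) + bias, scale=sd)
posterior /= np.trapz(posterior, grid)     # normalize

print(f"posterior mean of the unknown: {np.trapz(grid * posterior, grid):.0f}")
```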

15.
The Southern California Edison Company (SCE) has instituted a series of control strategies designed to minimize human exposure to polychlorinated biphenyls (PCBs) in electrical equipment used on its system. This paper describes a method of analyzing PCB risks using conservative estimates of human intake of PCBs originating from accidental spills from electrical equipment. The PCB releases from the Edison system were determined. The fate of these releases in soil, air, and water was analyzed to determine how much material reaches human receptors. The air and water pathways were determined to be the most likely candidates for the exposure and risk considerations. PCB intake via ingestion of soil at the spill site was neglected as an exposure pathway. Equipment spills without controls resulted in at most 2 ng/day human intake of PCBs via the water exposure pathway. This was determined to be negligible in comparison with the intake rates used in setting food tolerance levels, which assume fish to be the main dietary pathway of human exposure. In the case of no controls or cleanup, the inhalation exposure of the hundred or so persons in the immediate vicinity of a spill was determined to equal the PCB intakes of the fish-eating subpopulation analyzed by the Food and Drug Administration for the 2 ppm tolerance standard. Current cleanup procedures assure that even the persons in the immediate area are well below the intake of the subjects in the fish contamination analysis. All exposures were well below a "virtual safe dose" level estimated in the fish tolerance study.

16.
A Monte Carlo simulation is incorporated into a risk assessment for trichloroethylene (TCE) using physiologically-based pharmacokinetic (PBPK) modeling coupled with the linearized multistage model to derive human carcinogenic risk extrapolations. The Monte Carlo technique incorporates physiological parameter variability to produce a statistically derived range of risk estimates that quantifies specific uncertainties associated with PBPK risk assessment approaches. Both inhalation and ingestion exposure routes are addressed. Simulated exposure scenarios were consistent with those used by the Environmental Protection Agency (EPA) in their TCE risk assessment. Mean values of physiological parameters were gathered from the literature for both mice (the carcinogenic bioassay subjects) and humans. Realistic physiological value distributions were assumed using existing data on variability. Mouse cancer bioassay data were correlated to total TCE metabolized and to the area under the curve (blood concentration over time) of trichloroacetic acid (TCA), as determined by a mouse PBPK model. These internal dose metrics were used in a linearized multistage model analysis to determine dose metric values corresponding to a 10^-6 lifetime excess cancer risk. Using a human PBPK model, these metabolized doses were then extrapolated to equivalent human exposures (inhalation and ingestion). The Monte Carlo iterations with varying mouse and human physiological parameters produced a range of human exposure concentrations producing a 10^-6 risk.
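At low doses the linearized multistage model reduces to risk ≈ q1 × dose, so the final extrapolation step can be sketched by inverting that relation and propagating physiological variability. The slope and distributions below are invented for illustration, not the study's fitted values:

```python
import numpy as np

rng = np.random.default_rng(5)

q1 = 2.0e-2            # low-dose slope per (mg/kg/day) from a multistage fit (hypothetical)
target_risk = 1e-6
target_internal_dose = target_risk / q1    # metabolized dose yielding 1e-6 risk

# Monte Carlo over human physiology: fraction of inhaled chemical metabolized and
# intake per unit air concentration (both hypothetical distributions).
n = 50_000
f_met = rng.normal(0.5, 0.1, n).clip(0.1, 0.9)        # fraction metabolized
intake = rng.normal(0.29, 0.05, n).clip(0.1, None)    # mg/kg/day per ppm (assumed)

exposure_ppm = target_internal_dose / (f_met * intake)   # air conc. at 1e-6 risk
print("exposure at 1e-6 risk (ppm), 5th/50th/95th:",
      np.percentile(exposure_ppm, [5, 50, 95]).round(6))
```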

17.
Concern about the degree of uncertainty and potential conservatism in deterministic point estimates of risk has prompted researchers to turn increasingly to probabilistic methods for risk assessment. With Monte Carlo simulation techniques, distributions of risk reflecting uncertainty and/or variability are generated as an alternative. In this paper, the compounding of conservatism(1) between the level associated with point-estimate inputs selected from probability distributions and the level associated with the deterministic value of risk calculated using these inputs is explored. Two measures of compounded conservatism are compared and contrasted. The first measure, F, is defined as the ratio of the risk value Rd, calculated deterministically as a function of n inputs each at the jth percentile of its probability distribution, to the risk value Rj that falls at the jth percentile of the simulated risk distribution (i.e., F = Rd/Rj). The percentile of the simulated risk distribution that corresponds to the deterministic value Rd serves as a second measure of compounded conservatism. Analytical results for simple products of lognormal distributions are presented. In addition, a numerical treatment of several complex cases is presented, using five simulation analyses from the literature as illustrations. Overall, there are cases in which conservatism compounds dramatically for deterministic point estimates of risk constructed from upper percentiles of input parameters, as well as those for which the effect is less notable. The analytical and numerical techniques discussed are intended to help analysts explore the factors that influence the magnitude of compounding conservatism in specific cases.
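For the simple case of a risk that is a product of independent lognormal inputs, both measures of compounded conservatism can be computed directly; a sketch with illustrative parameter choices (n = 4 inputs with unit medians, log-sd 0.5, inputs taken at their 95th percentiles):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

n_inputs, sigma, j = 4, 0.5, 0.95   # illustrative choices
z = norm.ppf(j)

# Deterministic risk: every input at its own 95th percentile (unit medians).
r_d = np.exp(n_inputs * z * sigma)

# Simulated risk distribution: product of independent lognormal inputs.
x = rng.lognormal(0.0, sigma, size=(100_000, n_inputs))
risk = x.prod(axis=1)
r_j = np.percentile(risk, 100 * j)

print(f"first measure, F = Rd/Rj = {r_d / r_j:.2f}")
print("second measure, percentile of Rd in simulated risk:",
      (risk < r_d).mean().round(4))
# Analytic check for iid lognormal inputs: F = exp(z * sigma * (n - sqrt(n)))
print("analytic F:", np.exp(z * sigma * (n_inputs - np.sqrt(n_inputs))).round(2))
```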

18.
Monte Carlo simulations are commonplace in quantitative risk assessments (QRAs). Designed to propagate the variability and uncertainty associated with each individual exposure input parameter in a quantitative risk assessment, Monte Carlo methods statistically combine the individual parameter distributions to yield a single, overall distribution. Critical to such an assessment is the representativeness of each individual input distribution. The authors performed a literature review to collect and compare the distributions used in published QRAs for the parameters of body weight, food consumption, soil ingestion rates, breathing rates, and fluid intake. To provide a basis for comparison, all estimated exposure parameter distributions were evaluated with respect to four properties: consistency, accuracy, precision, and specificity. The results varied depending on the exposure parameter. Even where extensive, well-collected data exist, investigators used a variety of different distributional shapes to approximate these data. Where such data do not exist, investigators have collected their own data, often leading to substantial disparity in parameter estimates and subsequent choice of distribution. The present findings indicate that more attention must be paid to the data underlying these distributional choices. More emphasis should be placed on sensitivity analyses, quantifying the impact of assumptions, and on discussion of sources of variation as part of the presentation of any risk assessment results. If such practices and disclosures are followed, it is believed that Monte Carlo simulations can greatly enhance the accuracy and appropriateness of specific risk assessments. Without such disclosures, researchers will be increasing the size of the risk assessment "black box," a concern already raised by many critics of more traditional risk assessments.

19.
We describe a one-dimensional probabilistic model of the role of domestic food handling behaviors in salmonellosis risk associated with the consumption of eggs and egg-containing foods. Six categories of egg-containing foods were defined based on the amount of egg contained in the food, whether eggs are pooled, and the degree of cooking practiced by consumers. We used bootstrap simulation to quantify uncertainty in risk estimates due to sampling error, and sensitivity analysis to identify key sources of variability and uncertainty in the model. Because of typical model characteristics such as nonlinearity, interaction between inputs, thresholds, and saturation points, Sobol's method, a novel sensitivity analysis approach, was used to identify key sources of variability. Based on the mean probability of illness, examples of foods from the food categories, ranked from most to least risk of illness, were: (1) home-made salad dressings/ice cream; (2) fried eggs/boiled eggs; (3) omelettes; and (4) baked foods/breads. For food categories that may include uncooked eggs (e.g., home-made salad dressings/ice cream), consumer handling conditions such as storage time and temperature after food preparation were the key sources of variability. In contrast, for food categories associated with undercooked eggs (e.g., fried/soft-boiled eggs), the initial level of Salmonella contamination and the log10 reduction due to cooking were the key sources of variability. Important sources of uncertainty varied with both the risk percentile and the food category under consideration. This work adds to previous risk assessments focused on egg production and storage practices, and provides a science-based approach to inform consumer risk communications regarding safe egg handling practices.

20.
The distributional approach for uncertainty analysis in cancer risk assessment is reviewed and extended. The method considers a combination of bioassay study results, targeted experiments, and expert judgment regarding biological mechanisms to predict a probability distribution for uncertain cancer risks. Probabilities are assigned to alternative model components, including the determination of human carcinogenicity, the mode of action, the dosimetry measure for exposure, the mathematical form of the dose-response relationship, the experimental data set(s) used to fit the relationship, and the formula used for interspecies extrapolation. Alternative software platforms for implementing the method are considered, including Bayesian belief networks (BBNs) that facilitate assignment of prior probabilities, specification of relationships among model components, and identification of all output nodes on the probability tree. The method is demonstrated using the analysis by Evans, Sielken, and co-workers for predicting cancer risk from formaldehyde inhalation exposure. Uncertainty distributions are derived for the maximum likelihood estimate (MLE) and 95th percentile upper confidence limit (UCL) unit cancer risk estimates, and the effects of resolving selected model uncertainties on these distributions are demonstrated, considering both perfect and partial information for these model components. A method for synthesizing the results of multiple mechanistic studies is introduced, considering the assessed sensitivities and selectivities of the studies for their targeted effects. A highly simplified example is presented illustrating assessment of genotoxicity based on studies of DNA damage response caused by naphthalene and its metabolites. The approach can provide a formal mechanism for synthesizing multiple sources of information using a transparent and replicable weight-of-evidence procedure.
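The probability-tree backbone of the approach is just an enumeration of component alternatives with prior weights; a toy sketch in which every component, probability, and unit-risk value is invented for illustration:

```python
import itertools
import numpy as np

# Alternative model components with prior probabilities (all hypothetical).
carcinogenic = [(True, 0.8), (False, 0.2)]
dose_response = [("linear", 0.6), ("sublinear", 0.4)]
extrapolation = [("bw^3/4", 0.7), ("bw^1", 0.3)]

def unit_risk(carc, dr, ext):
    """Hypothetical unit-risk estimate for one branch of the probability tree."""
    if not carc:
        return 0.0
    base = 1e-5 if dr == "linear" else 2e-6
    return base * (1.0 if ext == "bw^3/4" else 0.5)

risks, weights = [], []
for (c, pc), (d, pd), (e, pe) in itertools.product(
        carcinogenic, dose_response, extrapolation):
    risks.append(unit_risk(c, d, e))
    weights.append(pc * pd * pe)   # branch probability = product of component priors

risks, weights = np.array(risks), np.array(weights)
print("expected unit risk:", np.sum(risks * weights))
print("P(unit risk = 0):", weights[risks == 0].sum())
```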
