Similar Documents
20 similar documents found.
1.
Information on exposure factors used in quantitative risk assessments has previously been compiled and reported for U.S. and European populations. However, due to the advancement of science and knowledge, these reports are in continuous need of updating with new data. Equally important is the change over time of many exposure factors related to both physiological characteristics and human behavior. Body weight, skin surface area, time use, and dietary habits are some of the most obvious examples covered here. A wealth of data is available from literature not primarily gathered for the purpose of risk assessment. Here we review a number of key exposure factors and compare these factors between northern Europe—here represented by Sweden—and the United States. Many previous compilations of exposure factor data focus on interindividual variability and variability between sexes and age groups, while uncertainty is mainly dealt with in a qualitative way. In this article variability is assessed along with uncertainty. As estimates of central tendency and interindividual variability, the mean, standard deviation, skewness, kurtosis, and multiple percentiles were calculated, while uncertainty was characterized using 95% confidence intervals for these parameters. The presented statistics are appropriate for use in deterministic analyses using point estimates for each input parameter as well as in probabilistic assessments.
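The statistical summary described in this abstract (central-tendency and variability estimates, each with a 95% confidence interval) can be sketched with a simple bootstrap. The block below uses a hypothetical body-weight sample in place of the Swedish and U.S. survey data, and all distribution parameters are illustrative assumptions, not values from the report.

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(7)

# Hypothetical body-weight sample (kg); stands in for the survey data
weights = rng.lognormal(mean=np.log(75.0), sigma=0.2, size=400)

def summary(x):
    # Point estimates of central tendency and interindividual variability
    return {
        "mean": np.mean(x), "sd": np.std(x, ddof=1),
        "skewness": skew(x), "kurtosis": kurtosis(x),
        "p50": np.percentile(x, 50), "p95": np.percentile(x, 95),
    }

point = summary(weights)

# Bootstrap resampling to characterize uncertainty in each statistic
boot = {k: [] for k in point}
for _ in range(2000):
    s = summary(rng.choice(weights, size=weights.size, replace=True))
    for k, v in s.items():
        boot[k].append(v)

for k, v in point.items():
    lo, hi = np.percentile(boot[k], [2.5, 97.5])
    print(f"{k:9s} {v:8.2f}  (95% CI {lo:.2f} to {hi:.2f})")
```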

2.
Methods to Approximate Joint Uncertainty and Variability in Risk
As interest in quantitative analysis of joint uncertainty and interindividual variability (JUV) in risk grows, so does the need for related computational shortcuts. To quantify JUV in risk, Monte Carlo methods typically require nested sampling of JUV in distributed inputs, which is cumbersome and time-consuming. Two approximation methods proposed here allow simpler and more rapid analysis. The first consists of new upper-bound JUV estimators that involve only uncertainty or variability, not both, and so never require nested sampling to calculate. The second is a discrete-probability-calculus procedure that uses only the mean and one upper-tail mean for each input in order to estimate mean and upper-bound risk; this procedure is simpler and more intuitive than similar ones in use. Application of these methods is illustrated in an assessment of cancer risk from residential exposures to chloroform in the Kanawha Valley, West Virginia. Because each of the multiple exposure pathways considered in this assessment had separate modeled sources of uncertainty and variability, the assessment illustrates a realistic case where a standard Monte Carlo approach to JUV analysis requires nested sampling. In the illustration, the first proposed method quantified JUV in cancer risk much more efficiently than corresponding nested Monte Carlo calculations. The second proposed method also nearly duplicated JUV-related and other estimates of risk obtained using Monte Carlo methods. Both methods were thus found adequate to obtain basic risk estimates accounting for JUV in a realistically complex risk assessment. These methods make routine JUV analysis more convenient and practical.
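For contrast with the shortcuts proposed in this paper, the standard nested (two-dimensional) Monte Carlo treatment of joint uncertainty and variability can be sketched as follows. The risk model, the lognormal inputs, and all parameter values are hypothetical placeholders, not the chloroform assessment from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N_UNC = 200    # outer loop: realizations of the uncertain parameters
N_VAR = 2000   # inner loop: individuals drawn from the variability distributions

# Toy low-dose linear model: individual risk = intake * potency
upper_tail_risk = []
for _ in range(N_UNC):
    potency = rng.lognormal(mean=np.log(1e-3), sigma=0.5)              # uncertain
    intakes = rng.lognormal(mean=np.log(2.0), sigma=0.8, size=N_VAR)   # variable
    risks = intakes * potency
    upper_tail_risk.append(np.percentile(risks, 95))   # 95th-percentile individual

# Uncertainty distribution of the upper-tail (variability) risk
print("median of the 95th-percentile risk:", np.median(upper_tail_risk))
print("97.5% upper confidence bound:      ", np.percentile(upper_tail_risk, 97.5))
```

The upper-bound estimators proposed in the paper avoid the nested loop entirely; the sketch only illustrates the baseline they are compared against.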

3.
An integrated, quantitative approach to incorporating both uncertainty and interindividual variability into risk prediction models is described. Individual risk R is treated as a variable distributed in both an uncertainty dimension and a variability dimension, whereas population risk I (the number of additional cases caused by R) is purely uncertain. I is shown to follow a compound Poisson-binomial distribution, which in low-level risk contexts can often be approximated well by a corresponding compound Poisson distribution. The proposed analytic framework is illustrated with an application to cancer risk assessment for a California population exposed to 1,2-dibromo-3-chloropropane from ground water.
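The Poisson-binomial structure of population risk and its low-risk Poisson approximation can be illustrated directly. The individual risks below are hypothetical draws, not the 1,2-dibromo-3-chloropropane results, and the sketch conditions on one realization of the uncertain risks rather than mixing over the full uncertainty distribution as the compound form requires.

```python
import numpy as np

rng = np.random.default_rng(1)

# One realization of low-level individual risks for a population of 20,000
p = np.clip(rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=20_000), 0.0, 1.0)

n_sim = 5_000

# Poisson-binomial population risk: each person is an independent Bernoulli(p_i)
cases_pb = np.array([rng.binomial(1, p).sum() for _ in range(n_sim)])

# Poisson approximation: number of cases ~ Poisson(sum of p_i)
cases_pois = rng.poisson(p.sum(), size=n_sim)

print("mean cases, Poisson-binomial vs Poisson:", cases_pb.mean(), cases_pois.mean())
print("P(at least one case), exact structure vs approximation:",
      (cases_pb >= 1).mean(), (cases_pois >= 1).mean())
```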

4.
Noncancer risk assessment traditionally relies on applied dose measures, such as concentration in inhaled air or in drinking water, to characterize no-effect levels or low-effect levels in animal experiments. Safety factors are then incorporated to address the uncertainties associated with extrapolating across species, dose levels, and routes of exposure, as well as to account for the potential impact of variability of human response. A risk assessment for chloropentafluorobenzene (CPFB) was performed in which a physiologically based pharmacokinetic model was employed to calculate an internal measure of effective tissue dose appropriate to each toxic endpoint. The model accurately describes the kinetics of CPFB in both rodents and primates. The model calculations of internal dose at the no-effect and low-effect levels in animals were compared with those calculated for potential human exposure scenarios. These calculations were then used in place of default interspecies and route-to-route safety factors to determine safe human exposure conditions. Estimates of the impact of model parameter uncertainty, as estimated by a Monte Carlo technique, also were incorporated into the assessment. The approach used for CPFB is recommended as a general methodology for noncancer risk assessment whenever the necessary pharmacokinetic data can be obtained.

5.
This paper presents a general model for exposure to homegrown foods that is used with a Monte Carlo analysis to determine the relative contributions of variability (Type A uncertainty) and true uncertainty (Type B uncertainty) to the overall variance in prediction of the dose-to-concentration ratio. Although classification of exposure inputs as uncertain or variable is somewhat subjective, food consumption rates and exposure duration are judged to have a predicted variance that is dominated by variability among individuals by age, income, culture, and geographical region, whereas biotransfer factors and partition factors are inputs that, to a large extent, involve uncertainty. Using ingestion of fruits, vegetables, grains, dairy products, meat, and soils assumed to be contaminated by hexachlorobenzene (HCB) and benzo(a)pyrene (BaP) as case studies, a Monte Carlo analysis is used to explore the relative contribution of uncertainty and variability to overall variance in the estimated distribution of potential dose within the population that consumes homegrown foods. It is found that, when soil concentrations are specified, variances in ratios of dose-to-concentration for HCB are equally attributable to uncertainty and variability, whereas for BaP, variance in these ratios is dominated by true uncertainty.
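The split of overall variance into variability (Type A) and uncertainty (Type B) can be sketched with a nested Monte Carlo and the law of total variance. The dose-to-concentration model below (consumption rate times biotransfer factor) and all distribution parameters are simplified assumptions, not the HCB or BaP pathway models from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

N_U, N_V = 500, 500   # outer (uncertainty) and inner (variability) sample sizes

ratios = np.empty((N_U, N_V))
for i in range(N_U):
    btf = rng.lognormal(mean=np.log(0.05), sigma=0.6)                   # uncertain biotransfer factor
    consumption = rng.lognormal(mean=np.log(0.3), sigma=0.4, size=N_V)  # variable consumption rate
    ratios[i, :] = consumption * btf                                    # dose-to-concentration ratio

log_r = np.log(ratios)
# Law of total variance: total = mean within-row variance (variability)
#                              + variance of the row means (uncertainty)
var_variability = log_r.var(axis=1).mean()
var_uncertainty = log_r.mean(axis=1).var()
total = var_variability + var_uncertainty
print("share attributable to variability:", round(var_variability / total, 2))
print("share attributable to uncertainty:", round(var_uncertainty / total, 2))
```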

6.
Quantitative approaches to assessing exposure to, and associated risk from, benzene in mineral spirits solvent (MSS), used widely in parts washing and degreasing operations, have focused primarily on the respiratory pathway. The dermal contribution to total benzene uptake from such operations remains uncertain because measuring in vivo experimental dermal uptake of this volatile human carcinogen is difficult. Unprotected dermal uptake involves simultaneous sustained immersion events and transient splash/wipe events, each yielding residues subject to evaporation as well as dermal uptake. A two-process dermal exposure framework to assess dermal uptake to normal and damaged skin was applied to estimate potential daily dermal benzene dose (Dskin) to workers who used historical or current formulations of recycled MSS in manual parts washers. Measures of evaporation and absorption of MSS dermally applied to human subjects were modeled to estimate in vivo dermal uptake of benzene in MSS. Uncertainty and interindividual variability in Dskin was characterized by Monte Carlo simulation, conditioned on uncertainty and/or variability estimated for each model input. Dermal exposures are estimated to average 33% of total (inhalation + dermal) benzene parts washing dose, with approximately equal predicted portions of dermal dose due to splash/wipe and to continuous contact with MSS. The estimated median (95th percentile) dermal and total daily benzene doses from parts washing are: 0.0069 (0.024) and 0.025 (0.18) mg/day using current, and 0.027 (0.085) and 0.098 (0.69) mg/day using historical, MSS solvents, respectively.

7.
For noncancer effects, the degree of human interindividual variability plays a central role in determining the risk that can be expected at low exposures. This discussion reviews available data on observations of interindividual variability in (a) breathing rates, based on observations in British coal miners; (b) systemic pharmacokinetic parameters, based on studies of a number of drugs; (c) susceptibility to neurological effects from fetal exposure to methyl mercury, based on observations of the incidence of effects in relation to hair mercury levels; and (d) chronic lung function changes in relation to long-term exposure to cigarette smoke. The quantitative ranges of predictions that follow from uncertainties in estimates of interindividual variability in susceptibility are illustrated.

8.
Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, using a fixed uncertainty factor rather than probabilistically accounting for human variability cannot readily support the probabilistic risk assessment advocated by a number of researchers; new methods are needed to probabilistically quantify human population variability. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that the proposed hierarchical structure adequately characterizes variability across different populations.
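The hierarchical structure described here, a population-specific background response and exposure sensitivity drawn from shared hyper-distributions, can be sketched generatively. The logistic dose-response form, the hyperparameter values, and the simulated populations below are illustrative assumptions; fitting the model to data, as the paper does, is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hyper-distributions shared across populations
mu_a, sd_a = -3.0, 0.5   # background response (intercept on the logit scale)
mu_b, sd_b = 0.8, 0.3    # sensitivity of response to exposure (slope)

def simulate_population(dose, n_subjects):
    a = rng.normal(mu_a, sd_a)   # population-specific intercept
    b = rng.normal(mu_b, sd_b)   # population-specific slope
    p = 1.0 / (1.0 + np.exp(-(a + b * dose)))
    return rng.binomial(n_subjects, p), a, b

# Five populations observed at the same exposure level
for k in range(5):
    cases, a, b = simulate_population(dose=2.0, n_subjects=100)
    print(f"population {k}: intercept={a:5.2f}  slope={b:4.2f}  cases={cases}")
```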

9.
Topics in Microbial Risk Assessment: Dynamic Flow Tree Process
Microbial risk assessment is emerging as a new discipline in risk assessment. A systematic approach to microbial risk assessment is presented that employs data analysis for developing parsimonious models and accounts formally for the variability and uncertainty of model inputs using analysis of variance and Monte Carlo simulation. The purpose of the paper is to raise and examine issues in conducting microbial risk assessments. The enteric pathogen Escherichia coli O157:H7 was selected as an example for this study due to its significance to public health. The framework for our work is consistent with the risk assessment components described by the National Research Council in 1983 (hazard identification; exposure assessment; dose-response assessment; and risk characterization). Exposure assessment focuses on hamburgers cooked at a range of temperatures from rare to well done, the latter being typical of fast food restaurants. Features of the model include predictive microbiology components that account for stochastic growth and death of organisms in hamburger. For dose-response modeling, Shigella data from human feeding studies were used as a surrogate for E. coli O157:H7. Risks were calculated using a threshold model and an alternative nonthreshold model. The 95% probability intervals for risk of illness for product cooked to a given internal temperature spanned five orders of magnitude for these models. The existence of even a small threshold has a dramatic impact on the estimated risk.

10.
Marc Kennedy, Andy Hart. Risk Analysis, 2009, 29(10): 1427-1442
We propose new models for dealing with various sources of variability and uncertainty that influence risk assessments for dietary exposure. The uncertain or random variables involved can interact in complex ways, and the focus is on methodology for integrating their effects and on assessing the relative importance of including different uncertainty model components in the calculation of dietary exposures to contaminants, such as pesticide residues. The combined effect is reflected in the final inferences about the population of residues and subsequent exposure assessments. In particular, we show how measurement uncertainty can have a significant impact on results and discuss novel statistical options for modeling this uncertainty. The effect of measurement error is often ignored, perhaps because the laboratory process conforms to the relevant international standards, or it is treated in an ad hoc way. These issues are common to many dietary risk analysis problems, and the methods could be applied to any food and chemical of interest. An example is presented using data on carbendazim in apples and consumption surveys of toddlers.

11.
A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size (CES). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose (ICED). Individuals in a population typically show variation, both in their individual exposure (IEXP) and in their ICED. Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between both distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure (IMoE). The proportion of the IMoE distribution below unity is the probability of critical exposure (PoCE) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., both regarding exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the probability of critical exposure (PoCE). The method is illustrated based on data for the case of dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions from the various sources of uncertainty involved may be quantified.
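The IMoE and PoCE calculations lend themselves to a compact Monte Carlo sketch. The lognormal IEXP and ICED distributions and the uncertainty placed on the ICED median below are illustrative assumptions, not the acephate results from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

n_individuals = 50_000

# Variability: individual exposure (IEXP) and individual critical effect dose (ICED)
iexp = rng.lognormal(mean=np.log(0.5), sigma=0.9, size=n_individuals)
iced = rng.lognormal(mean=np.log(50.0), sigma=0.6, size=n_individuals)

imoe = iced / iexp              # individual margin of exposure
poce = np.mean(imoe < 1.0)      # probability of critical exposure
print("point estimate of PoCE:", poce)

# Uncertainty: redraw the (uncertain) ICED median in an outer loop and collect
# the resulting uncertainty distribution of PoCE
poce_samples = []
for _ in range(500):
    median_iced = rng.lognormal(mean=np.log(50.0), sigma=0.3)
    iced_u = rng.lognormal(mean=np.log(median_iced), sigma=0.6, size=n_individuals)
    poce_samples.append(np.mean(iced_u / iexp < 1.0))
print("PoCE median and 95% uncertainty interval:",
      np.percentile(poce_samples, [50, 2.5, 97.5]))
```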

12.
Monte Carlo simulations are commonplace in quantitative risk assessments (QRAs). Designed to propagate the variability and uncertainty associated with each individual exposure input parameter in a quantitative risk assessment, Monte Carlo methods statistically combine the individual parameter distributions to yield a single, overall distribution. Critical to such an assessment is the representativeness of each individual input distribution. The authors performed a literature review to collect and compare the distributions used in published QRAs for the parameters of body weight, food consumption, soil ingestion rates, breathing rates, and fluid intake. To provide a basis for comparison, all estimated exposure parameter distributions were evaluated with respect to four properties: consistency, accuracy, precision, and specificity. The results varied depending on the exposure parameter. Even where extensive, well-collected data exist, investigators used a variety of different distributional shapes to approximate these data. Where such data do not exist, investigators have collected their own data, often leading to substantial disparity in parameter estimates and subsequent choice of distribution. The present findings indicate that more attention must be paid to the data underlying these distributional choices. More emphasis should be placed on sensitivity analyses, quantifying the impact of assumptions, and on discussion of sources of variation as part of the presentation of any risk assessment results. If such practices and disclosures are followed, it is believed that Monte Carlo simulations can greatly enhance the accuracy and appropriateness of specific risk assessments. Without such disclosures, researchers will be increasing the size of the risk assessment "black box," a concern already raised by many critics of more traditional risk assessments.

13.
Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose-response model parameters are estimated using limited epidemiological data is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility.
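For reference, the dose-response forms discussed in this abstract can be written out directly: the exponential model, the conventional beta-Poisson approximation, and the hypergeometric form of the beta-Poisson model. The parameter values below are illustrative only, and the Bayesian MCMC fitting done in OpenBUGS is not reproduced here.

```python
import numpy as np
from scipy.special import hyp1f1  # Kummer confluent hypergeometric function 1F1

def exponential_model(dose, r):
    # P(infection) = 1 - exp(-r * dose)
    return 1.0 - np.exp(-r * dose)

def beta_poisson_exact(dose, alpha, beta):
    # Hypergeometric form of the beta-Poisson probability of infection
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

def beta_poisson_approx(dose, alpha, beta):
    # Conventional approximation, appropriate roughly when beta >> 1 and beta >> alpha
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

alpha, beta = 0.25, 16.0   # illustrative parameter values, not fitted estimates
for dose in np.logspace(-1, 4, 6):
    print(f"dose={dose:10.1f}  exact={beta_poisson_exact(dose, alpha, beta):.4f}"
          f"  approx={beta_poisson_approx(dose, alpha, beta):.4f}")
```

Comparing the two columns gives a quick feel for the kind of approximation-validity criteria the paper proposes.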

14.
We investigate, through modeling, the impact of interindividual heterogeneity in the metabolism of 4-aminobiphenyl (ABP) and in physiological factors on human cancer risk. A physiological pharmacokinetic model was used to quantify the time course of the formation of the proximate carcinogen, N-hydroxy-4-ABP, and the DNA binding of the active species in the bladder. The metabolic and physiologic model parameters were randomly varied, via Monte Carlo simulations, to reproduce interindividual variability. The sampling means for most parameters were scaled from values developed by Kadlubar et al. (Cancer Res. 51:4371, 1991) for dogs; variances were obtained primarily from published human data (e.g., measurements of ABP N-oxidation and arylamine N-acetylation in human liver tissue). In 500 simulations, theoretically representing 500 humans, DNA-adduct levels in the bladder of the most susceptible individuals are ten thousand times higher than for the least susceptible, and the 5th and 95th percentiles differ by a factor of 160. DNA binding for the most susceptible individual (with low urine pH, low N-acetylation and high N-oxidation activities) is theoretically one million-fold higher than for the least susceptible (with high urine pH, high N-acetylation and low N-oxidation activities). The simulations also suggest that the four factors contributing most significantly to interindividual differences in DNA binding of ABP in human bladder are urine pH, ABP N-oxidation, ABP N-acetylation, and urination frequency.

15.
Benzene is myelotoxic and leukemogenic in humans exposed at high doses (>1 ppm, and more clearly above 10 ppm) for extended periods. However, leukemia risks at lower exposures are uncertain. Benzene occurs widely in the work environment and also in indoor air, but mostly below 1 ppm, so assessing the leukemia risks at these low concentrations is important. Here, we describe a human physiologically-based pharmacokinetic (PBPK) model that quantifies tissue doses of benzene and its key metabolites, benzene oxide, phenol, and hydroquinone, after inhalation and oral exposures. The model was integrated into a statistical framework that acknowledges sources of variation due to inherent intra- and interindividual variation, measurement error, and other data collection issues. A primary contribution of this work is the estimation of population distributions of key PBPK model parameters. We hypothesized that observed interindividual variability in the dosimetry of benzene and its metabolites resulted primarily from known or estimated variability in key metabolic parameters, and that a statistical PBPK model that explicitly included variability in only those metabolic parameters would sufficiently describe the observed variability. We then identified parameter distributions for the PBPK model to characterize observed variability through the use of Markov chain Monte Carlo analysis applied to two data sets. The identified parameter distributions described most of the observed variability, but variability in physiological parameters such as organ weights may also be helpful to faithfully predict the observed human-population variability in benzene dosimetry.

16.
The use of probabilistic approaches in exposure assessments of contaminants migrating from food packages is of increasing interest, but the lack of concentration or migration data is often cited as a limitation. Data accounting for the variability and uncertainty that can be expected in migration, for example, due to heterogeneity in the packaging system, variation of the temperature along the distribution chain, and different times of consumption of each individual package, are required for probabilistic analysis. The objective of this work was to characterize quantitatively the uncertainty and variability in estimates of migration. A Monte Carlo simulation was applied to a typical solution of Fick's law with given variability in the input parameters. The analysis was performed based on experimental data of a model system (migration of Irgafos 168 from polyethylene into isooctane) and illustrates how important sources of variability and uncertainty can be identified in order to refine analyses. For long migration times and controlled conditions of temperature, the affinity of the migrant to the food can be the major factor determining the variability in the migration values (more than 70% of the variance). In situations where both the time of consumption and temperature can vary, these factors can be responsible, respectively, for more than 60% and 20% of the variance in the migration estimates. The approach presented can be used with databases from consumption surveys to yield a true probabilistic estimate of exposure.
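As a rough illustration of propagating variable and uncertain inputs through a Fickian migration model, the sketch below uses a common short-time approximation of Fick's law for migration from a polymer into food. The formula choice, the parameter distributions, and the omission of the partition-limited plateau are all simplifying assumptions; they do not reproduce the Irgafos 168/isooctane system analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(8)

n = 20_000
R = 8.314  # gas constant, J/(mol K)

# Variable inputs: storage time and temperature differ between packages
t_days = rng.uniform(10, 180, size=n)
temp_c = rng.normal(23.0, 4.0, size=n)

# Uncertain inputs: diffusion coefficient at the reference temperature and its
# apparent activation energy (Arrhenius-type temperature dependence)
D_ref = rng.lognormal(mean=np.log(1e-12), sigma=0.5, size=n)   # dm^2/s, illustrative
Ea = rng.normal(80_000.0, 10_000.0, size=n)                    # J/mol
D = D_ref * np.exp(-(Ea / R) * (1.0 / (temp_c + 273.15) - 1.0 / 296.15))

c_p0 = 1000.0   # initial migrant concentration in the polymer, mg/kg (illustrative)
rho_p = 0.95    # polymer density, kg/dm^3
t_s = t_days * 86_400.0

# Short-time Fickian approximation of specific migration (mg/dm^2);
# the partition-limited equilibrium value is not modeled here
m_t = 2.0 * c_p0 * rho_p * np.sqrt(D * t_s / np.pi)

print("migration percentiles (mg/dm^2), 5th/50th/95th:",
      np.round(np.percentile(m_t, [5, 50, 95]), 3))
```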

17.
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probabilistic phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to a population that is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x); this article deals exclusively with the toxicokinetic component. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than the applied dose. It allows for the replacement of the default adult and children intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates of their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
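The data-derived intraspecies toxicokinetic factor described here is simply the ratio of the 95th to the 50th percentile of the population tissue-dose distribution. A minimal sketch, assuming a hypothetical lognormal tissue-dose distribution in place of the toxicokinetic model output:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical population distribution of annual-average target-tissue dose,
# standing in for the output of the probabilistic toxicokinetic model
tissue_dose = rng.lognormal(mean=np.log(1.0), sigma=0.45, size=200_000)

# Data-derived intraspecies toxicokinetic factor: 95th / 50th percentile
uf_h_tk = np.percentile(tissue_dose, 95) / np.percentile(tissue_dose, 50)
print("UF_H-TK:", round(uf_h_tk, 2))   # compared against the default value of about 3.2
```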

18.
The appearance of measurement error in exposure and risk factor data potentially affects any inferences regarding variability and uncertainty because the distribution representing the observed data set deviates from the distribution that represents an error-free data set. A methodology for improving the characterization of variability and uncertainty with known measurement errors in data is demonstrated in this article based on an observed data set, known measurement error, and a measurement-error model. A practical method for constructing an error-free data set is presented, and a numerical method based upon bootstrap pairs, incorporating two-dimensional Monte Carlo simulation, is introduced to address uncertainty arising from measurement error in selected statistics. When measurement error is a large source of uncertainty, substantial differences between the distribution representing variability of the observed data set and the distribution representing variability of the error-free data set will occur. Furthermore, the shape and range of the probability bands for uncertainty differ between the observed and error-free data sets. Failure to separately characterize contributions from random sampling error and measurement error will lead to bias in the variability and uncertainty estimates. However, a key finding is that total uncertainty in the mean can be properly quantified even if measurement and random sampling errors cannot be separated. An empirical case study is used to illustrate the application of the methodology.
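The core ideas, correcting the variability estimate for a known measurement-error variance and bootstrapping the mean so that total uncertainty is captured even without separating the two error sources, can be sketched as follows. The additive-error model, the sample size, and all parameter values are illustrative assumptions rather than the article's case study or its exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical observed data: true values plus known additive measurement error
n = 60
true_values = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)
sigma_me = 2.0                                        # known measurement-error SD
observed = true_values + rng.normal(0.0, sigma_me, size=n)

# Approximate error-free variability: Var(true) ~= Var(observed) - Var(error)
sd_obs = observed.std(ddof=1)
sd_error_free = np.sqrt(max(sd_obs**2 - sigma_me**2, 0.0))
print(f"observed SD = {sd_obs:.2f}, error-free SD estimate = {sd_error_free:.2f}")

# Bootstrapping the observed data propagates both random sampling error and
# measurement error into the uncertainty of the mean, without separating them
boot_means = [rng.choice(observed, size=n, replace=True).mean() for _ in range(5000)]
print("95% uncertainty interval for the mean:",
      np.round(np.percentile(boot_means, [2.5, 97.5]), 2))
```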

19.
The human toxicity potential, a weighting scheme used to evaluate toxic emissions for life cycle assessment and toxics release inventories, is based on potential dose calculations and toxicity factors. This paper evaluates the variance in potential dose calculations that can be attributed to the uncertainty in chemical-specific input parameters as well as the variability in exposure factors and landscape parameters. A knowledge of the uncertainty allows us to assess the robustness of a decision based on the toxicity potential; a knowledge of the sources of uncertainty allows us to focus our resources if we want to reduce the uncertainty. The potential dose of 236 chemicals was assessed. The chemicals were grouped by dominant exposure route, and a Monte Carlo analysis was conducted for one representative chemical in each group. The variance is typically one to two orders of magnitude. For comparison, the point estimates in potential dose for 236 chemicals span ten orders of magnitude. Most of the variance in the potential dose is due to chemical-specific input parameters, especially half-lives, although exposure factors such as fish intake and the source of drinking water can be important for chemicals whose dominant exposure is through indirect routes. Landscape characteristics are generally of minor importance.

20.
Since the National Food Safety Initiative of 1997, risk assessment has been an important issue in food safety areas. Microbial risk assessment is a systematic process for describing and quantifying the potential for adverse health effects associated with exposure to microorganisms. Various dose-response models for estimating microbial risks have been investigated. We have considered four two-parameter models and four three-parameter models in order to evaluate variability among the models for microbial risk assessment, using infectivity and illness data from studies with human volunteers exposed to a variety of microbial pathogens. Model variability is measured in terms of estimated ED01s and ED10s, with the view that these effective dose levels correspond to the lower and upper limits of the 1% to 10% risk range generally recommended for establishing benchmark doses in risk assessment. Parameters of the statistical models are estimated using the maximum likelihood method. In this article a weighted average of effective dose estimates from eight two- and three-parameter dose-response models, with weights determined by the Kullback information criterion, is proposed to address model uncertainties in microbial risk assessment. The proposed procedures for incorporating model uncertainties and making inferences are illustrated with human infection/illness dose-response data sets.
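The weighted-average step can be sketched with information-criterion weights of the standard exp(-Δ/2) form applied to per-model effective dose estimates. The model names, criterion values, and ED01 estimates below are hypothetical placeholders, and whether the paper's Kullback information criterion reduces to exactly this weighting is not assumed here.

```python
import numpy as np

# Hypothetical fitted results: (model name, information-criterion value, ED01 estimate)
models = [
    ("exponential",  152.3, 0.84),
    ("beta-Poisson", 149.1, 0.32),
    ("log-logistic", 150.6, 0.41),
    ("Weibull",      151.8, 0.55),
]

ic = np.array([m[1] for m in models])
ed01 = np.array([m[2] for m in models])

# Information-criterion weights: exp(-delta/2), normalized to sum to one
delta = ic - ic.min()
w = np.exp(-delta / 2.0)
w /= w.sum()

for (name, _, _), wi in zip(models, w):
    print(f"{name:12s} weight = {wi:.3f}")
print("model-averaged ED01:", round(float(np.sum(w * ed01)), 3))
```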
