Similar Documents
20 similar documents found.
1.
This paper demonstrates a new methodology for probabilistic public health risk assessment using the first-order reliability method. The method provides the probability that incremental lifetime cancer risk exceeds a threshold level, as well as a probabilistic sensitivity quantifying the relative impact of the uncertainty in each random variable on the exceedance probability. The approach is applied to a case study given by Thompson et al.(1) on cancer risk caused by ingestion of benzene-contaminated soil, and the results are compared to those of the Monte Carlo method. Parametric sensitivity analyses are conducted to assess the sensitivity of the probabilistic event with respect to the distribution parameters of the basic random variables, such as the mean and standard deviation. The technique is a novel approach to probabilistic risk assessment and can be used in situations where Monte Carlo analysis is computationally expensive, such as when the simulated risk lies in the tail of the risk probability distribution.
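As a sketch of the comparison this abstract describes, the exceedance probability of a multiplicative lognormal risk model can be obtained from a reliability-index calculation and checked against Monte Carlo. All parameter values below are illustrative assumptions, not those of the Thompson et al. case study:

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Illustrative risk model: R = X1 * X2 * X3 with independent lognormal factors
# (e.g., concentration, intake rate, slope factor); all parameters are assumed.
factors = [(math.log(1e-3), 0.5), (math.log(2.0), 0.3), (math.log(5e-2), 0.6)]
threshold = 3e-4   # assumed risk threshold for the exceedance event

# For a product of lognormals, ln R is normal, so the reliability index and
# the exceedance probability are available in closed form.
mu = sum(m for m, s in factors)
sigma = math.sqrt(sum(s * s for m, s in factors))
beta = (math.log(threshold) - mu) / sigma          # reliability index
p_exceed = 1.0 - norm_cdf(beta)

# Monte Carlo check of the same tail probability
random.seed(0)
n = 200_000
hits = sum(
    math.prod(random.lognormvariate(m, s) for m, s in factors) > threshold
    for _ in range(n)
)
p_mc = hits / n
```

Because the tail probability here is around 0.1, plain Monte Carlo converges quickly; for much rarer exceedances the reliability calculation keeps working at fixed cost, which is the situation the paper targets.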

2.
Concern about the degree of uncertainty and potential conservatism in deterministic point estimates of risk has prompted researchers to turn increasingly to probabilistic methods for risk assessment. With Monte Carlo simulation techniques, distributions of risk reflecting uncertainty and/or variability are generated as an alternative. In this paper, the compounding of conservatism(1) between the level associated with point estimate inputs selected from probability distributions and the level associated with the deterministic value of risk calculated using these inputs is explored. Two measures of compounded conservatism are compared and contrasted. The first measure, F, is defined as the ratio of the risk value, Rd, calculated deterministically as a function of n inputs, each at the jth percentile of its probability distribution, to the risk value, Rj, that falls at the jth percentile of the simulated risk distribution (i.e., F = Rd/Rj). The percentile of the simulated risk distribution that corresponds to the deterministic value, Rd, serves as a second measure of compounded conservatism. Analytical results for simple products of lognormal distributions are presented. In addition, a numerical treatment of several complex cases is illustrated using five simulation analyses from the literature. Overall, there are cases in which conservatism compounds dramatically for deterministic point estimates of risk constructed from upper percentiles of input parameters, as well as those for which the effect is less notable. The analytical and numerical techniques discussed are intended to help analysts explore the factors that influence the magnitude of compounded conservatism in specific cases.
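For products of lognormal inputs, both measures in this abstract have closed forms, which can be sketched as follows (the input parameters are invented for illustration):

```python
import math
import statistics

nd = statistics.NormalDist()
j = 0.95                                   # percentile used for each input
z = nd.inv_cdf(j)

# Assumed lognormal inputs (log-mean, log-sd) for a risk R = X1 * X2 * X3
inputs = [(math.log(1.0), 0.8), (math.log(0.5), 0.6), (math.log(2.0), 1.0)]

# Deterministic risk: every input set at its own j-th percentile
r_d = math.prod(math.exp(m + s * z) for m, s in inputs)

# True j-th percentile of the risk distribution (sums in log space)
mu = sum(m for m, s in inputs)
sigma = math.sqrt(sum(s * s for m, s in inputs))
r_j = math.exp(mu + sigma * z)

f_measure = r_d / r_j                              # first measure: F = Rd/Rj
pct_of_rd = nd.cdf((math.log(r_d) - mu) / sigma)   # second measure
```

The design point of the comparison: log-standard deviations add linearly in the deterministic product but only in quadrature in the true risk distribution, so F exceeds 1 whenever more than one input is set above its median.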

3.
Trichloroacetic acid (TCA) is a major metabolite of trichloroethylene (TRI) thought to contribute to its hepatocarcinogenic effects in mice. Recent studies have shown that peak blood concentrations of TCA in rats do not occur until approximately 12 hours following an oral dose of TRI. However, blood concentrations of TRI reach a maximum within an hour and are nondetectable after 2 hours.(1) The results of a study that examined the enterohepatic recirculation (EHC) of the principal TRI metabolites(2) were used to develop a physiologically-based pharmacokinetic model for TRI that includes enterohepatic recirculation of its metabolites. The model quantitatively predicts the uptake, distribution, and elimination of TRI, trichloroethanol (TCEOH), trichloroethanol-glucuronide, and TCA, and includes production of metabolites through the enterohepatic recirculation pathway. Physiologic parameters used in the model were obtained from the literature.(3,4) Parameters for TRI metabolism were taken from Fisher et al.(5) Other kinetic parameters were found in the literature or estimated from experimental data.(2) The model was calibrated to data from experiments in an earlier study in which TRI was orally administered.(2) Verification of the model was conducted using data on the enterohepatic recirculation of TCEOH and TCA,(2) chloral hydrate data (infusion doses) from Merdink,(1) and TRI data from Templin(1) and Larson and Bull.(1)

4.
Parodi et al.(1) and Zeise et al.(2) found a surprising statistical correlation (or association) between acute toxicity and carcinogenic potency. In order to shed light on the questions of whether or not it is a causal correlation, and whether or not it is a statistical or tautological artifact, we have compared the correlations for the NCI/NTP data set with those for chemicals not in this set. Carcinogenic potencies were taken from the Gold et al. database. We find a weak correlation, with an average value of TD50/LD50 = 0.04 for the non-NCI data set, compared with TD50/LD50 = 0.15 for the NCI data set. We conclude that it is not easy to distinguish types of carcinogens on the basis of whether or not they are acutely toxic.

5.
It has recently been suggested that "standard" data distributions for key exposure variables should be developed wherever appropriate for use in probabilistic or "Monte Carlo" exposure analyses. Soil-on-skin adherence estimates represent an ideal candidate for development of a standard data distribution: There are several readily available studies which offer a consistent pattern of reported results, and more importantly, soil adherence to skin is likely to vary little from site-to-site. In this paper, we thoroughly review each of the published soil adherence studies with respect to study design, sampling, and analytical methods, and level of confidence in the reported results. Based on these studies, probability density functions (PDF) of soil adherence values were examined for different age groups and different sampling techniques. The soil adherence PDF developed from adult data was found to resemble closely the soil adherence PDF based on child data in terms of both central tendency (mean = 0.49 and 0.63 mg-soil/cm2-skin, respectively) and 95th percentile values (1.6 and 2.4 mg-soil/cm2-skin, respectively). Accordingly, a single, "standard" PDF is presented based on all data collected for all age groups. This standard PDF is lognormally distributed; the arithmetic mean and standard deviation are 0.52 ± 0.9 mg-soil/cm2-skin. Since our review of the literature indicates that soil adherence under environmental conditions will be minimally influenced by age, sex, soil type, or particle size, this PDF should be considered applicable to all settings. The 50th and 95th percentile values of the standard PDF (0.25 and 1.7 mg-soil/cm2-skin, respectively) are very similar to recent U.S. EPA estimates of "average" and "upper-bound" soil adherence (0.2 and 1.0 mg-soil/cm2-skin, respectively).
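The percentiles quoted in this abstract can be recovered from the reported arithmetic moments by moment-matching a lognormal; the moment-matching step below is our reconstruction, with the input values taken from the abstract:

```python
import math
import statistics

mean, sd = 0.52, 0.9        # reported arithmetic mean and SD, mg-soil/cm2-skin

# Moment-matching: convert arithmetic moments to lognormal parameters
sigma2 = math.log(1.0 + (sd / mean) ** 2)
mu = math.log(mean) - sigma2 / 2.0
sigma = math.sqrt(sigma2)

p50 = math.exp(mu)                                                # median
p95 = math.exp(mu + statistics.NormalDist().inv_cdf(0.95) * sigma)
```

This gives roughly 0.26 and 1.8 mg-soil/cm2-skin, consistent with the 0.25 and 1.7 quoted above; the small differences reflect rounding of the reported moments.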

6.
The existence of a correlation between carcinogenic potency and the maximum tolerated dose has been the subject of many investigations in recent years. Several attempts have been made to quantify this correlation in different bioassay experiments. Using certain distributional assumptions, Krewski et al.(1) derived an analytic expression for the coefficient of correlation between the carcinogenic potency TD50 and the maximum tolerated dose. Here, we discuss the deviation that may result from using their analytical expression. By taking a more general approach we derive an expression for the correlation coefficient that includes the result of Krewski et al.(1) as a special case, and show that their expression may overestimate the correlation in some instances and underestimate it in others. The proposed method is illustrated by application to a real dataset.

7.
The recent decision of the U.S. Supreme Court on the regulation of CO2 emissions from new motor vehicles(1) shows the need for a robust methodology to evaluate the fraction of attributable risk from such emissions. The methodology must enable decisionmakers to reach practically relevant conclusions on the basis of expert assessments the decisionmakers see as an expression of research in progress, rather than as knowledge consolidated beyond any reasonable doubt.(2,3,4) This article presents such a methodology and demonstrates its use for the Alpine heat wave of 2003. In a Bayesian setting, different expert assessments on temperature trends and volatility can be formalized as probability distributions, with initial weights (priors) attached to them. By Bayesian learning, these weights can be adjusted in the light of data. The fraction of heat wave risk attributable to anthropogenic climate change can then be computed from the posterior distribution. We show that very different priors consistently lead to the result that anthropogenic climate change has contributed more than 90% to the probability of the Alpine summer heat wave in 2003. The present method can be extended to a wide range of applications where conclusions must be drawn from divergent assessments under uncertainty.
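The Bayesian-learning step this abstract describes, attaching prior weights to competing expert assessments and updating them on data, can be sketched as follows. The two models, their parameters, and the observations are all invented for illustration:

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution, used as the likelihood."""
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

# Hypothetical summer temperature anomalies (K) over successive periods
observations = [0.4, 0.9, 1.3, 1.8]

# Two competing expert assessments of the data-generating process
models = {
    "anthropogenic_trend": lambda t, x: normal_pdf(x, 0.4 * t, 0.5),  # warming trend
    "stationary_climate":  lambda t, x: normal_pdf(x, 0.0, 0.5),      # no trend
}
weights = {"anthropogenic_trend": 0.5, "stationary_climate": 0.5}     # priors

# Bayesian updating: multiply each weight by the likelihood, then renormalize
for t, x in enumerate(observations, start=1):
    weights = {name: w * models[name](t, x) for name, w in weights.items()}
    total = sum(weights.values())
    weights = {name: w / total for name, w in weights.items()}
```

The attributable fraction would then be computed from the posterior-weighted mixture; in this toy run the trend model ends up carrying essentially all the posterior weight.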

8.
What Do We Know About Making Risk Comparisons?
The risks of unfamiliar technologies are often evaluated by comparing them with the risks of more familiar ones. Such risk comparisons have been criticized for neglecting critical dimensions of risky decisions. In a guide written for the Chemical Manufacturers Association, Covello et al.(1) have summarized these critiques and developed a taxonomy that characterizes possible risk comparisons in terms of their acceptability (or objectionableness). We asked four diverse groups of subjects to judge the acceptability of 14 statements produced by Covello et al. as examples of their categories. We found no correlation between the judgments of acceptability produced by our subjects and those predicted by Covello et al.

9.
Cross-Cultural Differences in Risk Perception: A Model-Based Approach
The present study assessed cross-cultural differences in the perception of financial risks. Students at large universities in Hong Kong, Taiwan, the Netherlands, and the U.S., as well as a group of Taiwanese security analysts, rated the riskiness of a set of monetary lotteries. Risk judgments differed with nationality, but not with occupation (students vs. security analysts), and were modeled by the Conjoint Expected Risk (CER) model.(1) Consistent with cultural differences in country uncertainty avoidance,(2) CER model parameters of respondents from the two Western countries differed from those of respondents from the two countries with Chinese cultural roots: The risk judgments of respondents from Hong Kong and Taiwan were more sensitive to the magnitude of potential losses and less mitigated by the probability of positive outcomes.

10.
Lifetime cancer potency of aflatoxin was assessed based on the Yeh et al. study from China, in which both aflatoxin exposure and hepatitis B prevalence were measured. This study provides the best available information for estimating the carcinogenic risk posed by aflatoxin to the U.S. population. Cancer potency of aflatoxin was estimated using a biologically motivated risk assessment model. The best estimate of aflatoxin potency was 9 (mg/kg/day)−1 for individuals negative for hepatitis B and 230 (mg/kg/day)−1 for individuals positive for hepatitis B.

11.
This study assesses the risk of deaths and injuries from motor vehicle accidents associated with evacuating population groups in the event of a nuclear plant accident. The risk per person-km is evaluated using: (a) data from previous evacuations: information from the Soufriere evacuation (Guadeloupe Island, 1976) and Mississauga (1979), added to Hans and Sell's data, in which no road accident occurred in a sample of 1,500,000 persons; (b) the national recording system for motor vehicle accidents: average rates of 2.2 × 10^-8 deaths per person-km and 32 × 10^-8 injuries per person-km are calculated. These rates, for France, overestimate the number of casualties. A reasonable hypothesis is that road accident occurrences follow a Poisson distribution, as these events are independent and infrequent; since no accident was observed in a sample of 1,500,000 persons, the probability lies between 0 and an upper value of 0.24 × 10^-8 deaths per person-km and 3.29 × 10^-8 injuries per person-km. The average and maximum populations within different radii around French and U.S. nuclear power sites are taken as sample sizes in order to study the total risk of deaths and injuries under the hypothesis that an evacuation is necessary to protect the populations.
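The zero-event upper bound in (a) follows directly from the Poisson assumption: if no accident is observed over a total exposure E person-km, the 95% upper confidence bound on the rate λ solves exp(−λE) = 0.05. A minimal sketch, where the total exposure E is an assumed value chosen so the bound matches the figure quoted above:

```python
import math

# Zero accidents observed over a total evacuation exposure E (person-km).
# E is an illustrative assumption, not a figure from the study.
E = 1.25e9

# 95% upper bound on the Poisson rate: solve exp(-lam * E) = 0.05
lam_upper = math.log(20.0) / E     # deaths per person-km
```

Since ln 20 ≈ 3, this is the familiar "rule of three" for zero-event data; with E = 1.25e9 it yields about 0.24 × 10^-8 deaths per person-km, the upper value quoted above.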

12.
There is continuing concern about the exposure of persons to various chlorinated organics via the environment, for example, chlorinated disinfection byproducts in drinking water.(1) Some of these may be carcinogenic,(2) although the evidence is far from strong.(3) There is an accumulating body of evidence that one of the normal human immunological responses to foreign agents is the generation of hypochlorous acid (HOCl). This evidence will be summarized. The possibility that this HOCl generated in vivo could result in the formation of organochlorine compounds does not appear to have been seriously considered. Based on the best available information, the amount of such byproduct formation will be estimated.

13.
Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard "point" risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which compare and contrast the results of each approach. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
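The claim that stacking upper-percentile inputs pushes a point estimate far into the tail can be checked analytically for independent lognormal exposure variables with equal log-standard deviations; the setup is purely illustrative:

```python
import math
import statistics

nd = statistics.NormalDist()
z95 = nd.inv_cdf(0.95)

percentiles = {}
for n in (3, 5, 10):
    # With n inputs each set at its 95th percentile, the log of the point
    # estimate sits n * z95 * s above the log-mean of the risk distribution,
    # whose log-sd is sqrt(n) * s, so the point estimate's percentile is:
    percentiles[n] = nd.cdf(math.sqrt(n) * z95)
```

Already at n = 5 the point estimate lies beyond the 99.98th percentile of the simulated risk distribution, consistent with the abstract's observation for assessments with 10 or more exposure variables.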

14.
Pharmacokinetic models which incorporate independently measured anatomical characteristics and physiological flows have been widely used to predict the pharmacokinetic behavior of drugs, anesthetics, and other chemicals. Models appearing in the literature have included as many as 18(1) or as few as 5(2) tissue compartments. With the exception of the multiple-compartment delay trains used by Bischoff(3) to model the delays inherent to the appearance of drug metabolites in bile and segments of the intestinal lumen, very little effort has been made to incorporate the available information on gastrointestinal anatomy and physiology into more accurate gastrointestinal absorption/enterohepatic recirculation submodels. Since several authors have shown that the lymphatic system is the most significant route of absorption for highly lipophilic chemicals, we have constructed a model of gastrointestinal absorption that emphasizes chylomicron production and transport as the most significant route of absorption for nonvolatile, lipophilic chemicals. The absorption and distribution of hexachlorobenzene after intravenous vs. oral dosing are used to demonstrate features of this model.

15.
The purpose of this paper is to undertake a statistical analysis to specify empirical distributions and to estimate univariate parametric probability distributions for air exchange rates for residential structures in the United States. To achieve this goal, we used data compiled by the Brookhaven National Laboratory using a method known as the perfluorocarbon tracer (PFT) technique. While these data are not fully representative of all areas of the country or all housing types, they are judged to be by far the best available. The analysis is characterized by four key points: the use of data for 2,844 households; a four-region breakdown based on heating degree days, the best available measure of climatic factors affecting air exchange rates; estimation of lognormal distributions as well as provision of empirical (frequency) distributions; and provision of these distributions for all of the data, for the data segmented by the four regions, for the data segmented by the four seasons, and for the data segmented by a 16-cell region-by-season breakdown. Except in a few cases, primarily involving small sample sizes, air exchange rates were found to be well fit by lognormal distributions (adjusted R2 ≥ 0.95). The empirical or lognormal distributions may be used in indoor air models or as input variables for probabilistic human health risk assessments.

16.
There are a number of sources of variability in food consumption patterns and residue levels of a particular chemical (e.g., pesticide, food additive) in commodities that lead to an expected high level of variability in dietary exposures across a population. This paper focuses on examples of consumption pattern survey data for specific commodities, namely those for wine and grape juice, and demonstrates how such data might be analyzed in preparation for performing stochastic analyses of dietary exposure. Data from the NIAAA/NHIS wine consumption survey were subset for gender and age group and, with matched body weight data from the survey database, were used to define empirically-based percentile estimates for wine intake (μl wine/kg body weight) for the strata of interest. The data for these two subpopulations were analyzed to estimate 14-day consumption distributional statistics and distributions for only those days on which wine was consumed. Data subsets for all wine-consuming adults and for wine-consuming females ages 18 through 45 were determined to fit a lognormal distribution (R2 = 0.99 for both datasets). Market share data were incorporated into estimation of chronic exposures to hypothetical chemical residues in imported table wine. As a separate example, treatment of grape juice consumption data for females, ages 18–40, as a simple lognormal distribution resulted in a significant underestimation of intake, and thus exposure, because the actual distribution is a mixture (i.e., multiple subpopulations of grape juice consumers exist in the parent distribution). Thus, deriving dietary intake statistics from food consumption survey data requires careful analysis of the underlying empirical distributions.
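The underestimation reported for grape juice can be reproduced with synthetic data: fit a single lognormal to a two-component mixture and compare upper percentiles. The mixture components and sample sizes below are invented for illustration:

```python
import math
import random
import statistics

random.seed(1)

# Assumed mixture: two subpopulations of consumers with different intake levels
data = ([random.lognormvariate(0.0, 0.3) for _ in range(4000)]
        + [random.lognormvariate(1.5, 0.3) for _ in range(1000)])

# Naive single-lognormal fit via moments of log-intake
logs = [math.log(x) for x in data]
mu, sigma = statistics.fmean(logs), statistics.pstdev(logs)
p95_fitted = math.exp(mu + statistics.NormalDist().inv_cdf(0.95) * sigma)

# Empirical 95th percentile of the mixture
p95_empirical = sorted(data)[int(0.95 * len(data))]
```

The fitted lognormal smooths over the high-intake subpopulation and lands well below the empirical 95th percentile, which is exactly the abstract's warning about treating a mixture as a single distribution.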

17.
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population that is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF.
The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than the applied dose. It allows for the replacement of the default adult and child intraspecies UFs with toxicokinetic data-derived values and provides accurate chemical-specific estimates for their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
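The data-derived UF this framework produces is just a percentile ratio of the simulated tissue-dose distribution, as in this sketch; the dose distribution below is an assumed stand-in for actual PBTK Monte Carlo output:

```python
import math
import random

random.seed(2)

# Stand-in for PBTK Monte Carlo output: population distribution of the
# annual-average target-tissue dose (arbitrary units; parameters assumed)
doses = sorted(random.lognormvariate(math.log(1.0), 0.45) for _ in range(50_000))

p50 = doses[len(doses) // 2]
p95 = doses[int(0.95 * len(doses))]
uf_h_tk = p95 / p50      # data-derived intraspecies toxicokinetic UF
```

With the assumed log-sd of 0.45 the ratio comes out near exp(1.645 × 0.45) ≈ 2.1, below the default 3.2 allotted to toxicokinetics, illustrating how chemical-specific data can refine the default factor.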

18.
The relative contribution of four influenza virus exposure pathways—(1) virus-contaminated hand contact with facial membranes, (2) inhalation of respirable cough particles, (3) inhalation of inspirable cough particles, and (4) spray of cough droplets onto facial membranes—must be quantified to determine the potential efficacy of nonpharmaceutical interventions against transmission. We used a mathematical model to estimate the relative contributions of the four pathways to infection risk in the context of a person attending a bedridden family member ill with influenza. Considering the uncertainties in the sparse human subject influenza dose-response data, we assumed alternative ratios of 3,200:1 and 1:1 for the infectivity of inhaled respirable virus to intranasally instilled virus. For the 3,200:1 ratio, pathways (1), (2), and (4) contribute substantially to influenza risk: at a virus saliva concentration of 10^6 mL^-1, pathways (1), (2), (3), and (4) contribute, respectively, 31%, 17%, 0.52%, and 52% of the infection risk. With increasing virus concentrations, pathway (2) increases in importance, while pathway (4) decreases in importance. In contrast, for the 1:1 infectivity ratio, pathway (1) is the most important overall: at a virus saliva concentration of 10^6 mL^-1, pathways (1), (2), (3), and (4) contribute, respectively, 93%, 0.037%, 3.3%, and 3.7% of the infection risk. With increasing virus concentrations, pathway (3) increases in importance, while pathway (4) decreases in importance. Given the sparse knowledge concerning influenza dose and infectivity via different exposure pathways, nonpharmaceutical interventions for influenza should simultaneously address potential exposure via hand contact to the face, inhalation, and droplet spray.

19.
Twenty-four-hour recall data from the Continuing Survey of Food Intake by Individuals (CSFII) are frequently used to estimate dietary exposure for risk assessment. Food frequency questionnaires are traditional instruments of epidemiological research; however, their application in dietary exposure and risk assessment has been limited. This article presents a probabilistic method of bridging the National Health and Nutrition Examination Survey (NHANES) food frequency and the CSFII data to estimate longitudinal (usual) intake, using a case study of seafood mercury exposures for two population subgroups (females 16 to 49 years and children 1 to 5 years). Two hundred forty-nine CSFII food codes were mapped into 28 NHANES fish/shellfish categories. FDA and state/local seafood mercury data were used. A uniform distribution with minimum and maximum blood-diet ratios of 0.66 to 1.07 was assumed. A probabilistic assessment was conducted to estimate distributions of individual 30-day average daily fish/shellfish intakes, methyl mercury exposure, and blood levels. The upper percentile estimates of fish and shellfish intakes based on the 30-day daily averages were lower than those based on two- and three-day daily averages. These results support previous findings that distributions of "usual" intakes based on a small number of consumption days provide overestimates in the upper percentiles. About 10% of the females (16 to 49 years) and children (1 to 5 years) may be exposed to mercury levels above the EPA's RfD. The predicted 75th and 90th percentile blood mercury levels for the females in the 16-to-49-year group were similar to those reported by NHANES. The predicted 90th percentile blood mercury level for children in the 1-to-5-year subgroup was similar to NHANES, and the 75th percentile estimate was slightly above the NHANES value.

20.
A nonparametric estimator of the probability distribution of time-to-tumor is incorporated into an algorithm for calculating linearly extrapolated dosage limits from an animal carcinogenesis bioassay. The procedure is illustrated with tumor data from a mouse bioassay with 2-acetylaminofluorene. Extrapolated dosage limits for an excess risk of 10^-6 differ by only a factor of 2 across the six replicates of the experiment.
