Similar Documents
20 similar documents found (search time: 31 ms)
1.
Probabilistic risk assessments are enjoying increasing popularity as a tool to characterize the health hazards associated with exposure to chemicals in the environment. Because probabilistic analyses provide much more information to the risk manager than standard “point” risk estimates, this approach has generally been heralded as one which could significantly improve the conduct of health risk assessments. The primary obstacles to replacing point estimates with probabilistic techniques include a general lack of familiarity with the approach and a lack of regulatory policy and guidance. This paper discusses some of the advantages and disadvantages of the point estimate vs. probabilistic approach. Three case studies are presented which contrast and compare the results of each. The first addresses the risks associated with household exposure to volatile chemicals in tapwater. The second evaluates airborne dioxin emissions which can enter the food-chain. The third illustrates how to derive health-based cleanup levels for dioxin in soil. It is shown that, based on the results of Monte Carlo analyses of probability density functions (PDFs), the point estimate approach required by most regulatory agencies will nearly always overpredict the risk for the 95th percentile person by a factor of up to 5. When the assessment requires consideration of 10 or more exposure variables, the point estimate approach will often predict risks representative of the 99.9th percentile person rather than the 50th or 95th percentile person. This paper recommends a number of data distributions for various exposure variables that we believe are now sufficiently well understood to be used with confidence in most exposure assessments. A list of exposure variables that may require additional research before adequate data distributions can be developed is also discussed.
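The contrast the abstract draws can be sketched with a toy multiplicative intake model. This is a minimal illustration, not the paper's model: the concentration, intake-rate, and body-weight distributions below are assumed lognormals with made-up parameters, and the "point estimate" uses deliberately conservative single values.

```python
import math
import random

random.seed(1)

def lognormal(median, gsd):
    """Sample a lognormal defined by its median and geometric standard deviation."""
    return random.lognormvariate(math.log(median), math.log(gsd))

def intake(c, ir, bw):
    # dose (mg/kg-day) = concentration (mg/L) * intake rate (L/day) / body weight (kg)
    return c * ir / bw

# Point estimate built from conservative (upper-percentile) inputs
point = intake(c=0.1, ir=2.0, bw=60.0)

# Monte Carlo run with assumed distributions for the same three inputs
samples = sorted(
    intake(lognormal(0.05, 1.5), lognormal(1.4, 1.3), lognormal(70.0, 1.15))
    for _ in range(20000)
)
p95 = samples[int(0.95 * len(samples))]

print(f"point estimate: {point:.4g} mg/kg-day")
print(f"Monte Carlo 95th percentile: {p95:.4g} mg/kg-day")
```

Even with only three input variables, the conservative point estimate lands above the simulated 95th percentile; the abstract's observation is that this gap grows as more conservatively chosen inputs are multiplied together.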

2.
Workplace exposures to airborne chemicals are regulated in the U.S. by the Occupational Safety and Health Administration (OSHA) via the promulgation of permissible exposure limits (PELs). These limits, usually defined as eight-hour time-weighted average values, are enforced as concentrations never to be exceeded. In the case of chronic or delayed toxicants, the PEL is determined from epidemiological evidence and/or quantitative risk assessments based on long-term mean exposures or, equivalently, cumulative lifetime exposures. A statistical model was used to investigate the relation between the compliance strategy, the PEL as a limit never to be exceeded, and the health risk as measured by the probability that an individual's long-term mean exposure concentration is above the PEL. The model incorporates within-worker and between-worker variability in exposure, and assumes the relevant distributions to be log-normal. When data are inadequate to estimate the parameters of the full model, as it is in compliance inspections, it is argued that the probability of a random measurement being above the PEL must be regarded as a lower bound on the probability that a randomly selected worker's long-term mean exposure concentration will exceed the PEL. It is concluded that OSHA's compliance strategy is a reasonable, as well as a practical, means of limiting health risk for chronic or delayed toxicants.  相似文献   
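The within/between-worker lognormal structure the abstract refers to can be written down directly. All parameters below are illustrative assumptions; with them, the probability that a random single measurement exceeds the PEL does sit below the probability that a random worker's long-term mean exceeds it, consistent with the lower-bound argument (this ordering is a property of the chosen parameters, not a universal fact).

```python
import math
from statistics import NormalDist

# Assumed model: X_ij = exp(mu + b_i + w_ij), with b_i ~ N(0, sb^2) between
# workers and w_ij ~ N(0, sw^2) within a worker. A worker's long-term mean
# exposure is then exp(mu + b_i + sw^2 / 2).
mu, sb, sw = math.log(50.0), 0.5, 1.2   # log-scale parameters (assumed)
pel = 100.0                             # exposure limit (assumed units)
N = NormalDist()

# P(random single measurement > PEL): total log-variance is sb^2 + sw^2
p_measurement = 1 - N.cdf((math.log(pel) - mu) / math.hypot(sb, sw))

# P(random worker's long-term mean > PEL): only between-worker variance,
# with the within-worker variance shifting the mean upward
p_mean = 1 - N.cdf((math.log(pel) - mu - sw**2 / 2) / sb)

print(f"P(measurement > PEL) = {p_measurement:.3f}")
print(f"P(worker mean > PEL) = {p_mean:.3f}")
```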

3.
Health risk assessments have become so widely accepted in the United States that their conclusions are a major factor in many environmental decisions. Although the risk assessment paradigm is 10 years old, the basic risk assessment process has been used by certain regulatory agencies for nearly 40 years. Each of the four components of the paradigm has undergone significant refinements, particularly during the last 5 years. A recent step in the development of the exposure assessment component can be found in the 1992 EPA Guidelines for Exposure Assessment. Rather than assuming worst-case or hypothetical maximum exposures, these guidelines are designed to lead to an accurate characterization, making use of a number of scientific advances. Many exposure parameters have become better defined, and more sensitive techniques now exist for measuring concentrations of contaminants in the environment. Statistical procedures for characterizing variability, using Monte Carlo or similar approaches, eliminate the need to select point estimates for all individual exposure parameters. These probabilistic models can more accurately characterize the full range of exposures that may potentially be encountered by a given population at a particular site, reducing the need to select highly conservative values to account for this form of uncertainty in the exposure estimate. Lastly, our awareness of the uncertainties in the exposure assessment, as well as our knowledge as to how best to characterize them, will almost certainly provide evaluations that are more credible and, thereby, more useful to risk managers. If these refinements are incorporated into future exposure assessments, it is likely that our resources will be devoted to problems that, when resolved, will yield the largest improvement in public health.

4.
The public health community, news media, and members of the general public have expressed significant concern that methicillin‐resistant Staphylococcus aureus (MRSA) transmitted from pigs to humans may harm human health. Studies of the prevalence and dynamics of swine‐associated (ST398) MRSA have sampled MRSA at discrete points in the presumed causative chain leading from swine to human patients, including sampling bacteria from live pigs, retail meats, farm workers, and hospital patients. Nonzero prevalence is generally interpreted as indicating a potential human health hazard from MRSA infections, but quantitative assessments of resulting risks are not usually provided. This article integrates available data from several sources to construct a conservative (plausible upper bound) probability estimate for the actual human health harm (MRSA infections and fatalities) arising from ST398‐MRSA from pigs. The model provides plausible upper bounds of approximately one excess human infection per year among all U.S. pig farm workers, and one human infection per 31 years among the remaining total population of the United States. These results assume the possibility of transmission events not yet observed, so additional data collection may reduce these estimates further.  相似文献   

5.
Estimated Soil Ingestion Rates for Use in Risk Assessment   (cited 2 times: 0 self, 2 other)
Assessing the risks to human health posed by contaminants present in soil requires an estimate of likely soil ingestion rates. In the past, direct measurements of soil ingestion were not available and risk assessors were forced to estimate soil ingestion rates based on observations of mouthing behavior and measurements of soil on hands. Recently, empirical data on soil ingestion rates have become available from two sources (Binder et al., 1986 and van Wijnen et al., 1986). Although preliminary, these data can be used to derive better estimates of soil ingestion rates for use in risk assessments. Estimates of average soil ingestion rates derived in this paper range from 25 to 100 mg/day, depending on the age of the individual at risk. Maximum soil ingestion rates that are unlikely to underestimate exposure range from 100 to 500 mg. A value of 5,000 mg/day is considered a reasonable estimate of a maximum single-day exposure for a child with habitual pica.  相似文献   

6.
Recent headlines and scientific articles projecting significant human health benefits from changes in exposures too often depend on unvalidated subjective expert judgments and modeling assumptions, especially about the causal interpretation of statistical associations. Some of these assessments are demonstrably biased toward false positives and inflated effects estimates. More objective, data‐driven methods of causal analysis are available to risk analysts. These can help to reduce bias and increase the credibility and realism of health effects risk assessments and causal claims. For example, quasi‐experimental designs and analysis allow alternative (noncausal) explanations for associations to be tested, and refuted if appropriate. Panel data studies examine empirical relations between changes in hypothesized causes and effects. Intervention and change‐point analyses identify effects (e.g., significant changes in health effects time series) and estimate their sizes. Granger causality tests, conditional independence tests, and counterfactual causality models test whether a hypothesized cause helps to predict its presumed effects, and quantify exposure‐specific contributions to response rates in differently exposed groups, even in the presence of confounders. Causal graph models let causal mechanistic hypotheses be tested and refined using biomarker data. These methods can potentially revolutionize the study of exposure‐induced health effects, helping to overcome pervasive false‐positive biases and move the health risk assessment scientific community toward more accurate assessments of the impacts of exposures and interventions on public health.  相似文献   

7.
Topics in Microbial Risk Assessment: Dynamic Flow Tree Process   (cited 5 times: 0 self, 5 other)
Microbial risk assessment is emerging as a new discipline in risk assessment. A systematic approach to microbial risk assessment is presented that employs data analysis for developing parsimonious models and accounts formally for the variability and uncertainty of model inputs using analysis of variance and Monte Carlo simulation. The purpose of the paper is to raise and examine issues in conducting microbial risk assessments. The enteric pathogen Escherichia coli O157:H7 was selected as an example for this study due to its significance to public health. The framework for our work is consistent with the risk assessment components described by the National Research Council in 1983 (hazard identification; exposure assessment; dose-response assessment; and risk characterization). Exposure assessment focuses on hamburgers, cooked a range of temperatures from rare to well done, the latter typical for fast food restaurants. Features of the model include predictive microbiology components that account for random stochastic growth and death of organisms in hamburger. For dose-response modeling, Shigella data from human feeding studies were used as a surrogate for E. coli O157:H7. Risks were calculated using a threshold model and an alternative nonthreshold model. The 95% probability intervals for risk of illness for product cooked to a given internal temperature spanned five orders of magnitude for these models. The existence of even a small threshold has a dramatic impact on the estimated risk.  相似文献   
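The abstract's closing point, that even a small dose threshold dramatically changes low-dose risk, is easy to see with a toy dose-response pair. The exponential model and the threshold value below are illustrative assumptions, not the paper's fitted Shigella surrogate model.

```python
import math

def risk_nonthreshold(dose, r=0.001):
    """Exponential dose-response: each organism independently causes illness
    with probability r, so P(ill) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def risk_threshold(dose, r=0.001, threshold=10.0):
    """Same model, but zero response below an assumed minimum infectious dose."""
    return 0.0 if dose < threshold else risk_nonthreshold(dose, r)

for dose in (1, 5, 50, 500):
    print(f"dose={dose:4d}  nonthreshold={risk_nonthreshold(dose):.5f}  "
          f"threshold={risk_threshold(dose):.5f}")
```

Below the assumed threshold the two models disagree by the full size of the nonthreshold risk, which is exactly why the abstract reports that the existence of even a small threshold has a dramatic impact on the estimate.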

8.
A GIS-Based Framework for Hazardous Materials Transport Risk Assessment   (cited 2 times: 0 self, 2 other)
This article presents a methodology for assessment of the hazardous materials transport risk in a multicommodity, multiple origin-destination setting. The proposed risk assessment methodology was integrated with a Geographical Information System (GIS), which made large-scale implementation possible. A GIS-based model of the truck shipments of dangerous goods via the highway network of Quebec and Ontario was developed. Based on the origin and destination of each shipment, the risk associated with the routes that minimize (1) the transport distance, (2) the population exposure, (3) the expected number of people to be evacuated in case of an incident, and (4) the probability of an incident during transportation was evaluated. Using these assessments, a government agency can estimate the impact of alternative policies that could alter the carriers' route choices. A related issue is the spatial distribution of transport risk, because an unfair distribution is likely to cause public concern. Thus, an analysis of transport risk equity in the provinces of Quebec and Ontario is also provided.  相似文献   

9.
This paper demonstrates a new methodology for probabilistic public health risk assessment using the first-order reliability method. The method provides the probability that incremental lifetime cancer risk exceeds a threshold level, and the probabilistic sensitivity quantifying the relative impact of considering the uncertainty of each random variable on the exceedance probability. The approach is applied to a case study given by Thompson et al. (1) on cancer risk caused by ingestion of benzene-contaminated soil, and the results are compared to that of the Monte Carlo method. Parametric sensitivity analyses are conducted to assess the sensitivity of the probabilistic event with respect to the distribution parameters of the basic random variables, such as the mean and standard deviation. The technique is a novel approach to probabilistic risk assessment, and can be used in situations when Monte Carlo analysis is computationally expensive, such as when the simulated risk is at the tail of the risk probability distribution.  相似文献   
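For a purely multiplicative risk model with lognormal inputs, the log of the risk is normal, so the first-order reliability calculation of the exceedance probability is exact and can be checked against Monte Carlo. The sketch below uses this special case with assumed parameters; it illustrates the reliability-index idea, not the paper's general FORM algorithm (which handles arbitrary distributions via transformation to standard normal space).

```python
import math
import random
from statistics import NormalDist

random.seed(2)

# Assumed model: R = C * IR * SF, each factor lognormal.
# Each entry is (ln-median, ln-sd) of one factor; all values are illustrative.
params = [(math.log(1e-4), 0.6),   # concentration term
          (math.log(2.0), 0.3),    # intake term
          (math.log(0.05), 0.5)]   # slope-factor term
threshold = 1e-4                   # risk level of concern (assumed)

m = sum(mu for mu, _ in params)                    # mean of ln R
s = math.sqrt(sum(sd**2 for _, sd in params))      # sd of ln R

beta = (math.log(threshold) - m) / s               # reliability index
p_form = 1 - NormalDist().cdf(beta)                # P(R > threshold)

# Monte Carlo check of the same exceedance probability
n = 50000
mc = sum(
    math.exp(sum(random.gauss(mu, sd) for mu, sd in params)) > threshold
    for _ in range(n)
) / n

print(f"FORM exceedance probability: {p_form:.4f}")
print(f"Monte Carlo estimate:        {mc:.4f}")
```

The paper's motivation is visible here: the tail probability is small (~0.3%), so a crude Monte Carlo estimate needs many samples, while the reliability calculation is a single closed-form evaluation.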

10.
Risks from exposure to contaminated land are often assessed with the aid of mathematical models. The current probabilistic approach is a considerable improvement on previous deterministic risk assessment practices, in that it attempts to characterize uncertainty and variability. However, some inputs continue to be assigned as precise numbers, while others are characterized as precise probability distributions. Such precision is hard to justify, and we show in this article how rounding errors and distribution assumptions can affect an exposure assessment. The outcome of traditional deterministic point estimates and Monte Carlo simulations were compared to probability bounds analyses. Assigning all scalars as imprecise numbers (intervals prescribed by significant digits) added uncertainty to the deterministic point estimate of about one order of magnitude. Similarly, representing probability distributions as probability boxes added several orders of magnitude to the uncertainty of the probabilistic estimate. This indicates that the size of the uncertainty in such assessments is actually much greater than currently reported. The article suggests that full disclosure of the uncertainty may facilitate decision making in opening up a negotiation window. In the risk analysis process, it is also an ethical obligation to clarify the boundary between the scientific and social domains.  相似文献   
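The article's first observation, that scalars reported to a few significant digits conceal interval-sized uncertainty, can be reproduced with elementary interval arithmetic. The dose equation and all numbers below are illustrative assumptions, not taken from the assessment.

```python
def interval_mul(a, b):
    """Product of two intervals (lo, hi); assumes finite endpoints."""
    products = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(products), max(products))

def interval_div(a, b):
    """Quotient of two intervals; divisor must exclude zero."""
    assert b[0] > 0, "divisor interval must exclude zero"
    return interval_mul(a, (1.0 / b[1], 1.0 / b[0]))

# Reported point values, and the intervals their rounding implies (assumed)
conc = (0.25, 0.35)      # soil concentration reported as 0.3 mg/g
rate = (0.095, 0.105)    # soil ingestion reported as 0.1 g/day
bw = (65.0, 75.0)        # body weight reported as 70 kg

dose_point = 0.3 * 0.1 / 70.0                              # mg/kg-day
dose_interval = interval_div(interval_mul(conc, rate), bw)

print(f"point estimate: {dose_point:.3g} mg/kg-day")
print(f"interval: [{dose_interval[0]:.3g}, {dose_interval[1]:.3g}] mg/kg-day")
```

Even this three-variable example spans nearly a factor of two; probability bounds analysis extends the same idea from scalars to imprecisely known probability distributions.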

11.
Compliance Versus Risk in Assessing Occupational Exposures   (cited 1 time: 0 self, 1 other)
Assessments of occupational exposures to chemicals are generally based upon the practice of compliance testing in which the probability of compliance is related to the exceedance [γ, the likelihood that any measurement would exceed an occupational exposure limit (OEL)] and the number of measurements obtained. On the other hand, workers’ chronic health risks generally depend upon cumulative lifetime exposures which are not directly related to the probability of compliance. In this paper we define the probability of “overexposure” (θ) as the likelihood that individual risk (a function of cumulative exposure) exceeds the risk inherent in the OEL (a function of the OEL and duration of exposure). We regard θ as a relevant measure of individual risk for chemicals, such as carcinogens, which produce chronic effects after long-term exposures but not necessarily for acutely-toxic substances which can produce effects relatively quickly. We apply a random-effects model to data from 179 groups of workers, exposed to a variety of chemical agents, and obtain parameter estimates for the group mean exposure and the within- and between-worker components of variance. These estimates are then combined with OELs to generate estimates of γ and θ. We show that compliance testing can significantly underestimate the health risk when sample sizes are small. That is, there can be large probabilities of compliance with typical sample sizes, despite the fact that large proportions of the working population have individual risks greater than the risk inherent in the OEL. We demonstrate further that, because the relationship between θ and γ depends upon the within- and between-worker components of variance, it cannot be assumed a priori that exceedance is a conservative surrogate for overexposure. Thus, we conclude that assessment practices which focus upon either compliance or exceedance are problematic and recommend that employers evaluate exposures relative to the probabilities of overexposure.

12.
Researchers have long recognized that subjective perceptions of risk are better predictors of choices over risky outcomes than science‐based or experts’ assessments of risk. More recent work suggests that uncertainty about risks also plays a role in predicting choices and behavior. In this article, we develop and estimate a formal model for an individual's perceived health risks associated with arsenic contamination of his or her drinking water. The modeling approach treats risk as a random variable, with an estimable probability distribution whose variance reflects uncertainty. The model we estimate uses data collected from a survey given to a sample of people living in arsenic‐prone areas in the United States. The findings from this article support the conclusion that scientific information is essential to explaining the mortality rate perceived by individuals, but uncertainty about the probability remains significant.

13.
A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size ( CES ). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose ( ICED ). Individuals in a population typically show variation, both in their individual exposure ( IEXP ) and in their ICED . Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between both distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure ( IMoE ). The proportion of the IMoE distribution below unity is the probability of critical exposure ( PoCE ) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., both regarding exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the probability of critical exposure ( PoCE ). The method is illustrated based on data for the case of dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions from the various sources of uncertainty involved may be quantified.  相似文献   
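The core IMoE calculation the abstract describes, combining independent distributions for individual exposure (IEXP) and individual critical effect dose (ICED) by Monte Carlo, reduces to a few lines. The lognormal medians and GSDs below are illustrative assumptions, not the acephate case-study values, and the uncertainty (bootstrap) layer is omitted.

```python
import math
import random

random.seed(3)

def sample_lognormal(median, gsd, n):
    """Draw n samples from a lognormal given its median and geometric sd."""
    return [random.lognormvariate(math.log(median), math.log(gsd))
            for _ in range(n)]

n = 100_000
iexp = sample_lognormal(median=0.01, gsd=2.5, n=n)   # mg/kg bw/day (assumed)
iced = sample_lognormal(median=0.50, gsd=2.0, n=n)   # mg/kg bw/day (assumed)

# Individual margin of exposure, and the proportion below unity
imoe = [c / e for c, e in zip(iced, iexp)]
poce = sum(m < 1.0 for m in imoe) / n

print(f"probability of critical exposure (PoCE) = {poce:.5f}")
```

In the paper's full method this whole variability calculation is itself repeated under bootstrap resampling of the input data, yielding an uncertainty distribution for PoCE rather than a single number.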

14.
Hwang, Jing-Shiang; Chen, James J. Risk Analysis, 1999, 19(6): 1071-1076
The estimation of health risks from exposure to a mixture of chemical carcinogens is generally based on the combination of information from several available single compound studies. The current practice of directly summing the upper bound risk estimates of individual carcinogenic components as an upper bound on the total risk of a mixture is known to be generally too conservative. Gaylor and Chen (1996, Risk Analysis) proposed a simple procedure to compute an upper bound on the total risk using only the upper confidence limits and central risk estimates of individual carcinogens. The Gaylor-Chen procedure was derived based on an underlying assumption of normality for the distributions of individual risk estimates. In this paper we evaluate the Gaylor-Chen approach in terms of the coverage probability. The performance of the approach depends on the coverages of the upper confidence limits on the true risks of the individual carcinogens. In general, if the coverage probabilities for the individual carcinogens are all approximately equal to the nominal level, then the Gaylor-Chen approach should perform well. However, the Gaylor-Chen approach can be conservative or anti-conservative if some or all individual upper confidence limit estimates are conservative or anti-conservative.

15.
Route-to-Route Extrapolation of the Toxic Potency of MTBE   (cited 1 time: 0 self, 1 other)
MTBE is a volatile organic compound used as an oxygenating agent in gasoline. Inhalation from fumes while refueling automobiles is the principal route of exposure for humans, and toxicity by this route has been well studied. Oral exposures to MTBE exist as well, primarily due to ground-water contamination from leaking stationary sources, such as underground storage tanks. Assessing the potential public health impacts of oral exposures to MTBE is problematic because drinking water studies do not exist for MTBE, and the few oil-gavage studies from which a risk assessment could be derived are limited. This paper evaluates the suitability of the MTBE database for conducting an inhalation route-to-oral route extrapolation of toxicity. This includes evaluating the similarity of critical effect between these two routes, quantifiable differences in absorption, distribution, metabolism, and excretion, and sufficiency of toxicity data by the inhalation route. We conclude that such an extrapolation is appropriate and have validated the extrapolation by finding comparable toxicity between a subchronic gavage oral bioassay and oral doses we extrapolate from a subchronic inhalation bioassay. Our results are extended to the 2-year inhalation toxicity study by Chun et al. (1992) in which rats were exposed to 0, 400, 3000, or 8000 ppm MTBE for 6 hr/d, 5 d/wk. We have estimated the equivalent oral doses to be 0, 130, 940, or 2700 mg/kg/d. These equivalent doses may be useful in conducting noncancer and cancer risk assessments.

16.
A radiological dispersion device (RDD) or "dirty" bomb is a conventional explosive wrapped in radiological material. Terrorists may use an RDD to disperse radioactive material across a populated area, causing casualties and/or economic damage. Nearly all risk assessment models for RDDs make unrealistic assumptions about public behavior in their health assessments, including assumptions that the public would stand outside in a single location indefinitely. In this article, we describe an approach for assessing the risks of RDD events incorporating both physical dispersion and behavioral response variables. The general approach is tested using the City of Pittsburgh, Pennsylvania as a case study. Atmospheric models simulate an RDD attack and its likely fallout, while radiation exposure models assess fatal cancer risk. We model different geographical distributions of the population based on time of day. We evaluate aggregate health impacts for different public responses (i.e., sheltering-in-place, evacuating). We find that current RDD models in use can be improved with the integration of behavioral components. Using the results from the model, we show how risk varies across several behavioral and physical variables. We show that the best policy to recommend to the public depends on many different variables, such as the amount of trauma at ground zero, the capability of emergency responders to get trauma victims to local hospitals quickly and efficiently, how quickly evacuations can take place in the city, and the amount of shielding available for shelterers. Using a parametric analysis, we develop behaviorally realistic risk assessments, we identify variables that can affect an optimal risk reduction policy, and we find that decision making can be improved by evaluating the tradeoff between trauma and cancer fatalities for various RDD scenarios before they occur.  相似文献   

17.
Fish consumption rates play a critical role in the assessment of human health risks posed by the consumption of fish from chemically contaminated water bodies. Based on data from the 1989 Michigan Sport Anglers Fish Consumption Survey, we examined total fish consumption, consumption of self-caught fish, and consumption of Great Lakes fish for all adults, men, women, and certain higher risk subgroups such as anglers. We present average daily consumption rates as compound probability distributions consisting of a Bernoulli trial (to distinguish those who ate fish from those who did not) combined with a distribution (both empirical and parametric) for those who ate fish. We found that the average daily consumption rates for adults who ate fish are reasonably well fit by lognormal distributions. The compound distributions may be used as input variables for Monte Carlo simulations in public health risk assessments.  相似文献   
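The compound (Bernoulli × lognormal) representation the abstract describes can be sketched directly: a Bernoulli trial decides whether a respondent eats fish at all, and eaters draw a lognormal daily rate. The participation probability, median, and GSD below are illustrative assumptions, not the survey's fitted values.

```python
import math
import random

random.seed(4)

def daily_consumption(p_eats=0.8, median_g=15.0, gsd=3.0):
    """One draw from the compound distribution: 0 g/day for non-consumers,
    otherwise a lognormal consumption rate in g/day."""
    if random.random() >= p_eats:
        return 0.0
    return random.lognormvariate(math.log(median_g), math.log(gsd))

rates = [daily_consumption() for _ in range(50_000)]
frac_consumers = sum(r > 0 for r in rates) / len(rates)
mean_rate = sum(rates) / len(rates)

print(f"fraction consuming fish: {frac_consumers:.3f}")
print(f"population mean rate:    {mean_rate:.1f} g/day")
```

Draws like these plug straight into a Monte Carlo exposure simulation, which is exactly the use the authors propose for the fitted compound distributions.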

18.
Quantitative risk assessment often begins with an estimate of the exposure or dose associated with a particular risk level from which exposure levels posing low risk to populations can be extrapolated. For continuous exposures, this value, the benchmark dose, is often defined by a specified increase (or decrease) from the median or mean response at no exposure. This method of calculating the benchmark dose does not take into account the response distribution and, consequently, cannot be interpreted based upon probability statements of the target population. We investigate quantile regression as an alternative to the use of the median or mean regression. By defining the dose–response quantile relationship and an impairment threshold, we specify a benchmark dose as the dose associated with a specified probability that the population will have a response equal to or more extreme than the specified impairment threshold. In addition, in an effort to minimize model uncertainty, we use Bayesian monotonic semiparametric regression to define the exposure–response quantile relationship, which gives the model flexibility to estimate the quantal dose–response function. We describe this methodology and apply it to both epidemiology and toxicology data.  相似文献   

19.
In Part 1 of this article we developed an approach for the calculation of cancer effect measures for life cycle assessment (LCA). In this article, we propose and evaluate the method for the screening of noncancer toxicological health effects. This approach draws on the noncancer health risk assessment concept of benchmark dose, while noting important differences with regulatory applications in the objectives of an LCA study. We adopt the central-tendency estimate of the toxicological effect dose inducing a 10% response over background, ED10, to provide a consistent point of departure for default linear low-dose response estimates (βED10). This explicit estimation of low-dose risks, while necessary in LCA, is in marked contrast to many traditional procedures for noncancer assessments. For pragmatic reasons, mechanistic thresholds and nonlinear low-dose response curves were not implemented in the presented framework. In essence, for the comparative needs of LCA, we propose that one initially screens alternative activities or products on the degree to which the associated chemical emissions erode their margins of exposure, which may or may not be manifested as increases in disease incidence. We illustrate the method here by deriving the βED10 slope factors from bioassay data for 12 chemicals and outline some of the possibilities for extrapolation from other more readily available measures, such as the no observable adverse effect levels (NOAEL), avoiding uncertainty factors that lead to inconsistent degrees of conservatism from chemical to chemical. These extrapolations facilitated the initial calculation of slope factors for an additional 403 compounds, ranging from 10^(-6) to 10^(3) (risk per mg/kg-day dose). The potential consequences of the effects are taken into account in a preliminary approach by combining the βED10 with the severity measure disability-adjusted life years (DALY), providing a screening-level estimate of the potential consequences associated with exposures, integrated over time and space, to a given mass of chemical released into the environment for use in LCA.

20.
Various methods for risk characterization have been developed using probabilistic approaches. Data on Vietnamese farmers are available for the comparison of outcomes for risk characterization using different probabilistic methods. This article addresses the health risk characterization of chlorpyrifos using epidemiological dose‐response data and probabilistic techniques obtained from a case study with rice farmers in Vietnam. Urine samples were collected from farmers and analyzed for trichloropyridinol (TCP), which was converted into absorbed daily dose of chlorpyrifos. Adverse health response doses due to chlorpyrifos exposure were collected from epidemiological studies to develop dose‐adverse health response relationships. The health risk of chlorpyrifos was quantified using hazard quotient (HQ), Monte Carlo simulation (MCS), and overall risk probability (ORP) methods. With baseline (prior to pesticide spraying) and lifetime exposure levels (over a lifetime of pesticide spraying events), the HQ ranged from 0.06 to 7.1. The MCS method indicated less than 0.05% of the population would be affected while the ORP method indicated that less than 1.5% of the population would be adversely affected. With postapplication exposure levels, the HQ ranged from 1 to 32.5. The risk calculated by the MCS method was that 29% of the population would be affected, and the risk calculated by ORP method was 33%. The MCS and ORP methods have advantages in risk characterization due to use of the full distribution of data exposure as well as dose response, whereas HQ methods only used the exposure data distribution. These evaluations indicated that single‐event spraying is likely to have adverse effects on Vietnamese rice farmers.  相似文献   
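The distinction the abstract draws between a hazard quotient and an overall-risk-probability style calculation can be sketched for a lognormal exposure distribution. All numbers below (exposure parameters, reference dose, and the probit dose-response used in place of the study's epidemiological curve) are illustrative assumptions.

```python
import math
from statistics import NormalDist

N = NormalDist()

exp_median, exp_gsd = 0.004, 2.0     # exposure, mg/kg-day (assumed)
rfd = 0.003                          # reference dose, mg/kg-day (assumed)

# Hazard quotient at the 95th percentile of exposure: a single ratio
exp95 = exp_median * exp_gsd ** N.inv_cdf(0.95)
hq = exp95 / rfd

# ORP-style calculation: integrate P(effect | dose) over the full exposure
# distribution, here with an assumed probit (log-dose) dose-response curve
def p_effect(dose, ed50=0.05, slope=1.5):
    return N.cdf(slope * math.log(dose / ed50))

m = 2000  # quantile grid for the numerical integration
orp = sum(
    p_effect(exp_median * exp_gsd ** N.inv_cdf((i + 0.5) / m))
    for i in range(m)
) / m

print(f"HQ at 95th percentile exposure: {hq:.2f}")
print(f"overall risk probability:       {orp:.4f}")
```

This mirrors the abstract's finding: an HQ above 1 flags a potential concern but says nothing about how many people are affected, while the distribution-based calculation yields an estimated affected fraction of the population.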


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号