Similar Documents
20 similar documents found
1.
For noncancer effects, the degree of human interindividual variability plays a central role in determining the risk that can be expected at low exposures. This discussion reviews available data on observations of interindividual variability in (a) breathing rates, based on observations in British coal miners; (b) systemic pharmacokinetic parameters, based on studies of a number of drugs; (c) susceptibility to neurological effects from fetal exposure to methyl mercury, based on observations of the incidence of effects in relation to hair mercury levels; and (d) chronic lung function changes in relation to long-term exposure to cigarette smoke. The quantitative ranges of predictions that follow from uncertainties in estimates of interindividual variability in susceptibility are illustrated.

2.
Adam M. Finkel 《Risk analysis》2014,34(10):1785-1794
If exposed to an identical concentration of a carcinogen, every human being would face a different level of risk, determined by his or her genetic, environmental, medical, and other uniquely individual characteristics. Various lines of evidence indicate that this susceptibility variable is distributed rather broadly in the human population, with perhaps a factor of 25- to 50-fold between the center of this distribution and either of its tails, but cancer risk assessment at the EPA and elsewhere has always treated every (adult) human as identically susceptible. The National Academy of Sciences "Silver Book" concluded that EPA and the other agencies should fundamentally correct their mis-computation of carcinogenic risk in two ways: (1) adjust individual risk estimates upward to provide information about the upper tail; and (2) adjust population risk estimates upward (by about sevenfold) to correct an underestimation due to a mathematical property of the interindividual distribution of human susceptibility, in which the susceptibility averaged over the entire (right-skewed) population exceeds the median value for the typical human. In this issue of Risk Analysis, Kenneth Bogen disputes the second adjustment and endorses the first, though he also relegates the problem of underestimated individual risks to the realm of "equity concerns" that he says should have little if any bearing on risk management policy. In this article, I show why the basis for the population risk adjustment that the NAS recommended is correct—that current population cancer risk estimates, whether they are derived from animal bioassays or from human epidemiologic studies, likely provide estimates of the median with respect to human variation, which in turn must be an underestimate of the mean. If cancer risk estimates have larger "conservative" biases embedded in them, a premise I have disputed in many previous writings, such a defect would not excuse ignoring this additional bias in the direction of underestimation. I also demonstrate that sensible, legally appropriate, and ethical risk policy must not only inform the public when the tail of the individual risk distribution extends into the "high-risk" range, but must alter benefit-cost balancing to account for the need to try to reduce these tail risks preferentially.

3.
Upper-bound lifetime excess cancer risks were calculated for activities associated with asbestos abatement using a risk assessment framework developed for EPA's Superfund program. It was found that removals were associated with cancer risks to workers that were often greater than the commonly accepted cancer risk of 1 x 10(-6), although lower than occupational exposure limits associated with risks of 1 x 10(-3). Removals had little effect in reducing risk to school populations. Risks to teachers and students in school buildings containing asbestos were approximately the same as risks associated with exposure to ambient asbestos by the general public and were below the levels typically of concern to regulatory agencies. During abatement, however, there were increased risks to both workers and nearby individuals. Careless, everyday building maintenance generated the greatest risk to workers, followed by removals and encapsulation. If asbestos abatement were judged by the risk criteria applied to EPA's Superfund program, the no-action alternative would likely be selected in preference to removal in a majority of cases. These conclusions should only be interpreted within the context of an overall asbestos risk management program, which includes consideration of specific fiber types and sizes, sampling and analytical limitations, physical condition of asbestos-containing material, episodic peak exposures, and the number of people potentially exposed.

4.
For diseases with more than one risk factor, the sum of probabilistic estimates of the number of cases caused by each individual factor may exceed the total number of cases observed, especially when uncertainties about exposure and dose response for some risk factors are high. In this study, we outline a method of bounding the fraction of lung cancer fatalities not due to specific well-studied causes. Such information serves as a "reality check" for estimates of the impacts of the minor risk factors, and, as such, complements the traditional risk analysis. With lung cancer as our example, we allocate portions of the observed lung cancer mortality to known causes (such as smoking, residential radon, and asbestos fibers) and describe the uncertainty surrounding those estimates. The interactions among the risk factors are also quantified, to the extent possible. We then infer an upper bound on the residual mortality due to "other" causes, using a consistency constraint on the total number of deaths, the maximum uncertainty principle, and the mathematics of imprecise probabilities.
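The bounding logic in this abstract can be sketched in a few lines. The interval estimates below are invented for illustration (they are not the paper's figures); the point is how a consistency constraint turns lower bounds on known causes into an upper bound on the residual "other" category:

```python
# Hypothetical interval estimates of the fraction of lung cancer deaths
# attributable to each well-studied cause (illustrative values only).
known_causes = {
    "smoking":  (0.80, 0.90),   # (lower bound, upper bound)
    "radon":    (0.05, 0.15),
    "asbestos": (0.01, 0.04),
}

# Consistency constraint: all attributable fractions, including the
# residual "other" category, must account for exactly the observed
# deaths. The residual is therefore at most 1 minus the sum of the
# lower bounds on the known causes...
residual_upper = 1.0 - sum(lo for lo, hi in known_causes.values())
# ...and at least 1 minus the sum of the upper bounds (floored at 0).
residual_lower = max(0.0, 1.0 - sum(hi for lo, hi in known_causes.values()))

print(f"residual 'other' fraction in [{residual_lower:.2f}, {residual_upper:.2f}]")
```

With these assumed intervals, the residual fraction is bounded above by 0.14 even though the upper bounds on the individual causes overlap and sum past 1, which is exactly the "reality check" role the authors describe.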

5.
To assess the maximum possible impact of further government regulation of asbestos exposure, projections were made of the use of asbestos in nine product categories for the years 1985-2000. A life table risk assessment model was then developed to estimate the excess cases of cancer and lost person-years of life likely to occur among those occupationally and nonoccupationally exposed to the nine asbestos product categories manufactured in 1985-2000. These estimates were made under the assumption that government regulation remains at its 1985 level. Use of asbestos in the nine product categories was predicted to decline in all cases except for friction products. The risk assessment results show that, although the cancer risks from future exposure to asbestos are significantly less than those from past exposures, in the absence of more stringent regulations, a health risk remains.

6.
We review approaches for characterizing "peak" exposures in epidemiologic studies and methods for incorporating peak exposure metrics in dose–response assessments that contribute to risk assessment. The focus was on potential etiologic relations between environmental chemical exposures and cancer risks. We searched the epidemiologic literature on environmental chemicals classified as carcinogens in which cancer risks were described in relation to "peak" exposures. These articles were evaluated to identify some of the challenges associated with defining and describing cancer risks in relation to peak exposures. We found that definitions of peak exposure varied considerably across studies. Of nine chemical agents included in our review of peak exposure, six had epidemiologic data used by the U.S. Environmental Protection Agency (US EPA) in dose–response assessments to derive inhalation unit risk values. These were benzene, formaldehyde, styrene, trichloroethylene, acrylonitrile, and ethylene oxide. All derived unit risks relied on cumulative exposure for dose–response estimation and none, to our knowledge, considered peak exposure metrics. This is not surprising, given the historical linear no-threshold default model (generally based on cumulative exposure) used in regulatory risk assessments. With newly proposed US EPA rule language, fuller consideration of alternative exposure and dose–response metrics will be supported. "Peak" exposure has not been consistently defined and rarely has been evaluated in epidemiologic studies of cancer risks. We recommend developing uniform definitions of "peak" exposure to facilitate fuller evaluation of dose response for environmental chemicals and cancer risks, especially where mechanistic understanding indicates that the dose response is unlikely to be linear and that short-term high-intensity exposures increase risk.

7.
Cancer risks for ethylene dibromide (EDB) were estimated by fitting several linear non-threshold additive models to data from a gavage bioassay. Risks predicted by these models were compared to the observed cancer mortality among a cohort of workers occupationally exposed to the same chemical. Models that accounted for the shortened latency period in the gavaged rats predicted upper bound risks that were within a factor of 3 of the observed cancer deaths. Data from an animal inhalation study of EDB also were compatible with the epidemiologic data. These findings contradict those of Ramsey et al. (1978), who reported that extrapolation from animal data produced highly exaggerated risk estimates for EDB-exposed workers. This paper explores the reasons for these discrepant findings.

8.
When high-dose tumor data are extrapolated to low doses, it is typically assumed that the dose of a carcinogen delivered to target cells is proportional to the dose administered to test animals, even at exposure levels below the experimental range. Since pharmacokinetic data are becoming available that in some cases question the validity of this assumption, risk assessors must decide whether to maintain the standard assumption. A pilot study of formaldehyde is reported that was undertaken to demonstrate how expert scientific judgment can help guide a controversial risk assessment where pharmacokinetic data are considered inconclusive. Eight experts on pharmacokinetic data were selected by a formal procedure, and each was interviewed personally using a structured interview protocol. The results suggest that expert scientific opinion is polarized in this case, a situation that risk assessors can respond to with a range of risk characterizations considered biologically plausible by the experts. Convergence of expert opinion is likely in this case if several specific research strategies are executed in a competent fashion. Elicitation of expert scientific judgment is a promising vehicle for evaluating the quality of pharmacokinetic data, expressing uncertainty in risk assessment, and fashioning a research agenda that offers the possibility of forging scientific consensus.

9.
Following a comprehensive evaluation of the health risks of radon, the U.S. National Research Council (US-NRC) concluded that the radon inside the homes of U.S. residents is an important cause of lung cancer. To assess lung cancer risks associated with radon exposure in Canadian homes, we apply the new US-NRC techniques, tailoring assumptions to the Canadian context. A two-dimensional uncertainty analysis is used to provide both population-based (population attributable risk, PAR; excess lifetime risk ratio, ELRR; and life-years lost, LYL) and individual-based (ELRR and LYL) estimates. Our primary results for the Canadian population reveal that the mean estimates of ELRR, PAR, and LYL are 0.08, 8%, and 0.10 years, respectively. Results are also stratified by smoking status (ever versus never). Conveniently, the three indices (ELRR, PAR, and LYL) reveal similar output uncertainty (geometric standard deviation, GSD, approximately 1.3), and in the case of ELRR and LYL, comparable variability and uncertainty combined (GSD approximately 4.2). Simplifying relationships are identified between ELRR, LYL, PAR, and the age-specific excess rate ratio (ERR), which suggest a way to scale results from one population to another. This insight is applied in scaling our baseline results to obtain gender-specific estimates, as well as in simplifying and illuminating sensitivity analysis.

10.
Dioxin (2,3,7,8-tetrachlorodibenzo-p-dioxin; TCDD), a widespread polychlorinated aromatic hydrocarbon, caused tumors in the liver and other sites when administered chronically to rats at doses as low as 0.01 μg/kg/day. It functions in combination with a cellular protein, the Ah receptor, to alter gene regulation, and this resulting modulation of gene expression is believed to be obligatory for both dioxin toxicity and carcinogenicity. The U.S. EPA is reevaluating its dioxin risk assessment and, as part of this process, will be developing risk assessment approaches for chemicals, such as dioxin, whose toxicity is receptor-mediated. This paper describes a receptor-mediated physiologically based pharmacokinetic (PB-PK) model for the tissue distribution and enzyme-inducing properties of dioxin and discusses the potential role of these models in a biologically motivated risk assessment. In this model, ternary interactions among the Ah receptor, dioxin, and DNA binding sites lead to enhanced production of specific hepatic proteins. The model was used to examine the tissue disposition of dioxin and the induction of both a dioxin-binding protein (presumably, cytochrome P4501A2) and cytochrome P4501A1. Tumor promotion correlated more closely with predicted induction of P4501A1 than with induction of hepatic binding proteins. Although increased induction of these proteins is not expected to be causally related to tumor formation, these physiological dosimetry and gene-induction response models will be important for biologically motivated dioxin risk assessments in determining both target tissue dose of dioxin and gene products and in examining the relationship between these gene products and the cellular events more directly involved in tumor promotion.

11.
Approaches to risk assessment have been shown to vary among regulatory agencies and across jurisdictional boundaries according to the different assumptions and justifications used. Approaches to screening-level risk assessment from six international agencies were applied to an urban case study focusing on benzo[a]pyrene (B[a]P) exposure and compared in order to provide insight into the differences between agency methods, assumptions, and justifications. Exposure estimates ranged four-fold, with most of the dose stemming from exposure to animal products (8-73%) and plant products (24-88%). Total cancer risk across agencies varied by two orders of magnitude, with exposure to air and plant and animal products contributing most to total cancer risk, while the air contribution showed the greatest variability (1-99%). Variability in cancer risk of 100-fold was attributed to choices of toxicological reference values (TRVs), either based on a combination of epidemiological and animal data, or on animal data. The contribution and importance of the urban exposure pathway for cancer risk varied according to the TRV and, ultimately, according to differences in risk assessment assumptions and guidance. While all agency risk assessment methods are predicated on science, the study results suggest that the largest impact on the differential assessment of risk by international agencies comes from policy and judgment, rather than science.

12.
Typical exposures to lead often involve a mix of long-term exposures to relatively constant exposure levels (e.g., residential yard soil and indoor dust) and highly intermittent exposures at other locations (e.g., seasonal recreational visits to a park). These types of exposures can be expected to result in blood lead concentrations that vary on a temporal scale with the intermittent exposure pattern. Prediction of short-term (or seasonal) blood lead concentrations arising from highly variable intermittent exposures requires a model that can reliably simulate lead exposures and biokinetics on a temporal scale that matches that of the exposure events of interest. If exposure model averaging times (EMATs) of the model exceed the shortest exposure duration that characterizes the intermittent exposure, uncertainties will be introduced into risk estimates because the exposure concentration used as input to the model must be time averaged to account for the intermittent nature of the exposure. We have used simulation as a means of determining the potential magnitude of these uncertainties. Simulations using models having various EMATs have allowed exploration of the strengths and weaknesses of various approaches to time averaging of exposures and their impact on risk estimates associated with intermittent exposures to lead in soil. The International Commission on Radiological Protection (ICRP) model of lead pharmacokinetics in humans simulates lead intakes that can vary in intensity over time spans as small as one day, allowing for the simulation of intermittent exposures to lead as a series of discrete daily exposure events. The ICRP model was used to compare the outcomes (blood lead concentration) of various time-averaging adjustments for approximating the time-averaged intake of lead associated with various intermittent exposure patterns. Results of these analyses suggest that standard approaches to time averaging (e.g., U.S. EPA) that estimate the long-term daily exposure concentration can, in some cases, result in substantial underprediction of short-term variations in blood lead concentrations when used in models that operate with EMATs exceeding the shortest exposure duration that characterizes the intermittent exposure. Alternative time-averaging approaches recommended for use in lead risk assessment more reliably predict short-term periodic (e.g., seasonal) elevations in blood lead concentration that might result from intermittent exposures. In general, risk estimates will be improved by simulation on shorter time scales that more closely approximate the actual temporal dynamics of the exposure.
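The averaging-time problem described above is easy to see numerically. The sketch below uses invented intake values (not the study's data) to show how a model with a one-year averaging time flattens a one-month exposure spike that a 30-day averaging window resolves:

```python
# Illustrative only: a year of daily lead intake (µg/day) with a
# 30-day recreational exposure spike; values are invented for the sketch.
baseline, spike = 5.0, 50.0
daily_intake = [baseline] * 365
for day in range(180, 210):          # one month of intermittent exposure
    daily_intake[day] = spike

# Long EMAT: a single constant annual-average intake, which is what
# standard time averaging feeds a model whose averaging time is a year.
annual_avg = sum(daily_intake) / 365

# Short EMAT: a 30-day moving average resolves the seasonal peak.
window = 30
monthly_avg = [sum(daily_intake[i:i + window]) / window
               for i in range(365 - window + 1)]

print(f"annual average intake : {annual_avg:.1f} µg/day")
print(f"peak 30-day average   : {max(monthly_avg):.1f} µg/day")
```

Here the annual average (about 8.7 µg/day) understates the peak 30-day intake (50 µg/day) by roughly a factor of six, which is the kind of short-term underprediction the abstract attributes to models whose EMAT exceeds the shortest exposure duration.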

13.
The improvement of food safety in the domestic environment requires a transdisciplinary approach, involving interaction between both the social and natural sciences. This approach is applied in a study on risks associated with Campylobacter on broiler meat. First, some web-based information interventions were designed and tested on participant motivation and intentions to cook more safely. Based on these self-reported measures, the intervention supported by the emotion "disgust" was selected as the most promising information intervention. Its effect on microbial cross-contamination was tested by recruiting a set of participants who prepared a salad with chicken breast fillet carrying a known amount of tracer bacteria. The amount of tracer that could be recovered from the salad revealed the transfer and survival of Campylobacter and was used as a measure of hygiene. This was introduced into an existing risk model on Campylobacter in the Netherlands to assess the effect of the information intervention both at the level of exposure and the level of human disease risk. We showed that the information intervention supported by the emotion "disgust" alone had no measurable effect on the health risk. However, when a behavioral cue was embedded within the instruction for the salad preparation, the risk decreased sharply. It is shown that a transdisciplinary approach, involving research on risk perception, microbiology, and risk assessment, is successful in evaluating the efficacy of an information intervention in terms of human health risks. The approach offers a novel tool for science-based risk management in the area of food safety.

14.
The goal of this study was to systematically evaluate the choices made in deriving a chronic oral noncancer human health reference value (HHRV) for a given chemical by different organizations, specifically those from the U.S. Environmental Protection Agency, Health Canada, RIVM (the Netherlands), and the U.S. Agency for Toxic Substances and Disease Registry. This analysis presents a methodological approach for comparing both the HHRVs and the specific choices made in the process of deriving an HHRV across these organizations. Overall, across the 96 unique chemicals and 171 two-way organizational comparisons, the HHRV agreed approximately 26% of the time. A qualitative method for identifying the primary factors influencing these HHRV differences was also developed, using arrays of HHRVs across organizations for the same chemical. The primary factors identified were disagreement on the critical or principal study and differential application of the total uncertainty factor (UF) across organizations. Of the cases where the total UF was the primary factor influencing HHRV disagreement, the database UF had the greatest influence.

15.
A recent report by the National Academy of Sciences estimates that the radiation dose to the bronchial epithelium, per working level month (WLM) of radon daughter exposure, is about 30% lower for residential exposures than for exposures received in underground mines. Adjusting the previously published BEIR IV radon risk model accordingly, the unit risk for indoor exposures of the general population is about 2.2 x 10(-4) lung cancer deaths (lcd)/WLM. Using results from EPA's National Residential Radon Survey, the average radon level is estimated to be about 1.25 pCi/L, and the annual average exposure about 0.242 WLM. Based on these estimates, 13,600 radon-induced lcd/yr are projected for the United States. A quantitative uncertainty analysis was performed, which considers: statistical uncertainties in the epidemiological studies of radon-exposed miners; the dependence of risk on age at, and time since, exposure; the extrapolation of risk estimates from mines to homes based on comparative dosimetry; and uncertainties in the radon daughter levels in homes and in the average residential occupancy. Based on this assessment of the uncertainties in the unit risk and exposure estimates, an uncertainty range of 7000-30,000 lcd/yr is derived.
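The headline projection in this abstract is a straightforward product of the unit risk, the average annual exposure, and the exposed population. The population figure below is an assumption (the abstract does not state one; roughly 255 million is consistent with the U.S. in the early 1990s):

```python
# Reproducing the headline arithmetic; the U.S. population figure
# is an assumption, not stated in the abstract.
unit_risk = 2.2e-4        # lung cancer deaths (lcd) per WLM of exposure
annual_exposure = 0.242   # average WLM per person per year
population = 255e6        # assumed U.S. population

annual_deaths = unit_risk * annual_exposure * population
print(f"projected radon-induced lcd/yr: {annual_deaths:,.0f}")
```

With these inputs the product comes to roughly 13,600 deaths per year, matching the abstract's central estimate, with the 7000-30,000 range reflecting the uncertainty analysis rather than this point calculation.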

16.
The main impetus to the development of information about major industrial hazards in the European Community comes from the so-called Seveso Directive, which defines an information network and requires the generation and transmission of information as the basis for accident prevention and risk management. This important policy development, which calls for the formal identification and analysis of major hazards and the communication of risk information to members of the public, presents new opportunities and challenges to risk analysis and research in Europe. This paper briefly reviews the accidents that gave rise to the Directive and shaped its content, and then summarizes its requirements. The status of its implementation in the EC Member States is discussed, with special emphasis given to the comparison of safety analysis practices, the Major Accident Reporting System (MARS), and risk communication. Some new research directions stimulated by the Directive are identified.

17.
If a specific biological mechanism could be determined by which a carcinogen increases lung cancer risk, how might this knowledge be used to improve risk assessment? To explore this issue, we assume (perhaps incorrectly) that arsenic in cigarette smoke increases lung cancer risk by hypermethylating the promoter region of gene p16INK4a, leading to a more rapid entry of altered (initiated) cells into a clonal expansion phase. The potential impact on lung cancer of removing arsenic is then quantified using a three-stage version of a multistage clonal expansion (MSCE) model. This refines the usual two-stage clonal expansion (TSCE) model of carcinogenesis by resolving its intermediate or "initiated" cell compartment into two subcompartments, representing experimentally observed "patch" and "field" cells. This refinement allows p16 methylation effects to be represented as speeding transitions of cells from the patch state to the clonally expanding field state. Given these assumptions, removing arsenic might greatly reduce the number of non-small cell lung cancer cells (NSCLCs) produced in smokers, by up to two-thirds, depending on the fraction (between 0 and 1) of the smoking-induced increase in the patch-to-field transition rate prevented if arsenic were removed. At present, this fraction is unknown (and could be as low as zero), but the possibility that it could be high (close to 1) cannot be ruled out without further data.

18.
We evaluate, for the U.S. case, the costs and benefits of three security measures designed to reduce the likelihood of a direct replication of the 9/11 terrorist attacks. To do so, we assess risk reduction, losses, and security costs in the context of the full set of security layers. The three measures evaluated are installed physical secondary barriers (IPSB) to restrict access to the hardened cockpit door during door transitions, the Federal Air Marshal Service (FAMS), and the Federal Flight Deck Officer (FFDO) Program. In the process, we examine an alternate policy measure: doubling the budget of the FFDO program to $44 million per year, installing IPSBs in all U.S. aircraft at a cost of $13.5 million per year, and reducing funding for FAMS by 75% to $300 million per year. A break-even cost-benefit analysis then finds the minimum probability of an otherwise successful attack required for the benefit of each security measure to equal its cost. We find that the IPSB is cost-effective if the annual probability of an otherwise successful attack exceeds 0.5%, or one attack every 200 years. The FFDO program is cost-effective if the annual attack probability exceeds 2%. On the other hand, more than two otherwise successful attacks per year are required for FAMS to be cost-effective. A policy that includes IPSBs, an increased budget for FFDOs, and a reduced budget for FAMS may be a viable policy alternative, potentially saving hundreds of millions of dollars per year with consequences for security that are, at most, negligible.
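The break-even logic in this kind of analysis can be sketched simply: a measure pays for itself when the expected losses it averts per year equal its annual cost. The function below is a generic sketch, and the implied benefit-per-attack value is derived from the abstract's IPSB figures rather than stated in it:

```python
# Break-even cost-benefit sketch using the IPSB figures from the
# abstract; the implied loss-per-attack value is derived, not given.
def break_even_probability(annual_cost, losses_averted, risk_reduction):
    """Minimum annual probability of an otherwise successful attack at
    which the measure's expected benefit equals its annual cost."""
    return annual_cost / (losses_averted * risk_reduction)

# IPSB: $13.5M/yr is cost-effective above a 0.5%/yr attack probability,
# implying averted losses times risk reduction of about $2.7B per attack.
implied_benefit = 13.5e6 / 0.005
p_star = break_even_probability(13.5e6, implied_benefit, 1.0)
print(f"implied averted loss per attack: ${implied_benefit / 1e9:.1f}B")
print(f"break-even annual attack probability: {p_star:.3%}")
```

The same function explains the ranking in the abstract: because FAMS costs far more per year than the IPSB for a comparable benefit term, its break-even attack frequency is correspondingly far higher.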

19.
《Risk analysis》1996,16(6):841-848
Currently, risk assessments of the potential human health effects associated with exposure to pathogens are utilizing the conceptual framework that was developed to assess risks associated with chemical exposures. However, the applicability of the chemical framework is problematic due to many issues that are unique to assessing risks associated with pathogens. These include, but are not limited to, an assessment of pathogen/host interactions, consideration of secondary spread, consideration of short- and long-term immunity, and an assessment of conditions that allow the microorganism to propagate. To address this concern, a working group was convened to develop a conceptual framework to assess the risks of human disease associated with exposure to pathogenic microorganisms. The framework that was developed consists of three phases: problem formulation, analysis (which includes characterization of exposure and human health effects), and risk characterization. The framework emphasizes the dynamic and iterative nature of the risk assessment process, and allows wide latitude for planning and conducting risk assessments in diverse situations, each based on the common principles discussed in the framework.

20.
The communication and regulation of risk has changed significantly over the past 30 years in Europe and to a noticeable but lesser extent in the United States. In Europe, this is partly due to a series of regulatory mishaps, ranging from mad cow disease in the United Kingdom to contamination of the blood supply in France. In the United States, general public confidence in the American government has been gradually declining for more than three decades, driven by a mix of cultural and political conflicts like negative political advertising, a corrosive news media, and cuts in regulatory budgets. While the former approach is based on an objective assessment of the risk, the latter is driven more by the perception of the risk, consumer sentiment, political will, and sectoral advocacy. In this article, the author examines three U.S.-based food case studies (acrylamide, bisphenol A, and artificial food colorings) where regulations at the local and state levels are increasingly being based on perceived risk advocacy rather than on the most effective response to the risk, be it to food safety or public health, as defined by regulatory interpretation of existing data. In the final section, the author puts forward a series of recommendations for how U.S.-based regulators can best handle those situations where the perceived risk is markedly different from the fact-based risk, such as strengthening the communication departments of food regulatory agencies, training officials in risk communication, and working more proactively with neutral third-party experts.
