Similar Documents
20 similar documents found (search time: 31 ms)
1.
General patterns of bias in risk beliefs are well established in the literature, but much less is known about how these biases vary across the population. Using a sample of almost 500 people, the regression analysis in this article yields results consistent with the well-established pattern that small risks are overassessed and large risks are underassessed. The accuracy of these risk beliefs varies across demographic factors, as does the switch point at which people go from overassessment to underassessment, which we found to be 1500 deaths annually for the full sample. Better educated people have more accurate risk beliefs, and there are important differences in risk perception by race and gender that also may be of policy interest.
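The over/underassessment pattern described above is often represented as a log-linear relation between perceived and actual risk with a slope below one. A minimal sketch, with a hypothetical slope and the 1500-death crossover from the abstract:

```python
import math

def perceived_deaths(actual, slope=0.5, crossover=1500.0):
    """Log-linear risk-perception model (hypothetical slope).

    Perceived and actual annual deaths agree at the crossover point;
    a slope < 1 compresses the scale, so small risks are overassessed
    and large risks are underassessed."""
    intercept = (1.0 - slope) * math.log(crossover)
    return math.exp(intercept + slope * math.log(actual))

for actual in (10, 1500, 100000):
    print(f"actual {actual:>6} -> perceived {perceived_deaths(actual):8.0f}")
```

For any slope between 0 and 1 the model reproduces the qualitative pattern: risks below the crossover are perceived as larger than they are, risks above it as smaller.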

2.
Aggregate exposure metrics based on sums or weighted averages of component exposures are widely used in risk assessments of complex mixtures, such as asbestos-associated dusts and fibers. Allowed exposure levels based on total particle or fiber counts and estimated ambient concentrations of such mixtures may be used to make costly risk-management decisions intended to protect human health and to remediate hazardous environments. We show that, in general, aggregate exposure information alone may be inherently unable to guide rational risk-management decisions when the components of the mixture differ significantly in potency and when the percentage compositions of the mixture exposures differ significantly across locations. Under these conditions, which are not uncommon in practice, aggregate exposure metrics may be "worse than useless," in that risk-management decisions based on them are less effective than decisions that ignore the aggregate exposure information and select risk-management actions at random. The potential practical significance of these results is illustrated by a case study of 27 exposure scenarios in El Dorado Hills, California, where applying an aggregate unit risk factor (from EPA's IRIS database) to aggregate exposure metrics produces average risk estimates about 25 times greater - and of uncertain predictive validity - compared to risk estimates based on specific components of the mixture that have been hypothesized to pose risks of human lung cancer and mesothelioma.
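The failure mode described above can be shown with a toy example (all potencies and concentrations are hypothetical): two sites with identical aggregate fiber counts but different compositions can differ enormously in component-resolved risk, so the aggregate metric carries no ranking information.

```python
# Two fiber classes with very different unit potencies (hypothetical numbers).
potency = {"amphibole": 1.0, "chrysotile": 0.01}

# Two sites with the same total fiber count but different compositions.
sites = {
    "site_A": {"amphibole": 0.9, "chrysotile": 99.1},   # fibers/cc
    "site_B": {"amphibole": 90.0, "chrysotile": 10.0},
}

for name, mix in sites.items():
    aggregate = sum(mix.values())                       # what an aggregate metric sees
    risk = sum(potency[k] * v for k, v in mix.items())  # component-resolved risk
    print(name, aggregate, round(risk, 2))
```

Both sites score 100 on the aggregate metric, yet the component-resolved risk differs by roughly 50-fold, so ranking sites by the aggregate is no better than ranking them at random.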

3.
In attempts to soothe the nascent fear of the scheduled airline traveler, passengers awaiting takeoff are sometimes reminded of the cliché that they may have already completed the most dangerous part of their trip — the drive to the airport. The objective of this paper is to communicate under what conditions air travel is indeed safer than highway travel and vice versa. The conventional wisdom among risk communicators that air travel is so much safer than car travel arises from the most widely quoted death rates per billion miles for each — 0.6 for air compared to 24 for road. There are three reasons why such an unqualified comparison of aggregated fatality rates is inappropriate. First, the airline rate is passenger fatalities per passenger mile, whereas the road rate is all fatalities (any occupants, pedestrians, etc.) per vehicle mile. Second, road travel that competes with air travel is on the rural interstate system, not on average roads. Third, driver and vehicle characteristics, and driver behavior, lead to car-driver risks that vary over a wide range. Expressions derived to compare risk for drivers with given characteristics to those on airline trips of given distance showed that 40-year-old, belted, alcohol-free drivers of cars 700 pounds heavier than average are slightly less likely to be killed in 600 miles of rural interstate driving than in airline trips of the same length. Compared to this driver, 18-year-old, unbelted, intoxicated, male drivers of cars 700 pounds lighter than average have a risk over 1000 times greater. Furthermore, it is shown that the cliché above is untrue for a group of drivers having the age distribution of airline passengers.
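The structure of the comparison can be sketched with the quoted aggregate rates and hypothetical driver-specific multipliers (the actual paper derives these from crash data; the numbers below are illustrative only):

```python
AIR_RATE = 0.6e-9   # passenger fatalities per passenger-mile (quoted aggregate)
ROAD_RATE = 24e-9   # fatalities per vehicle-mile (quoted aggregate)

trip_miles = 600
air_risk = AIR_RATE * trip_miles

# Hypothetical multipliers on the aggregate road rate, in the spirit of the
# paper's adjustments (rural interstate, age, belt use, sobriety, car mass).
rural_interstate = 0.25      # rural interstates are safer than average roads
safe_driver = 0.05           # 40-year-old, belted, sober, heavier car
risky_driver = 60.0          # 18-year-old, unbelted, intoxicated, lighter car

safe_road_risk = ROAD_RATE * rural_interstate * safe_driver * trip_miles
risky_road_risk = ROAD_RATE * rural_interstate * risky_driver * trip_miles

print(f"air: {air_risk:.2e}, safe driver: {safe_road_risk:.2e}, "
      f"risky driver: {risky_road_risk:.2e}")
```

With these illustrative multipliers the safe driver's 600-mile road risk falls below the air risk for the same distance, while the risky driver's risk exceeds the safe driver's by more than three orders of magnitude, matching the qualitative conclusions of the abstract.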

4.
Risks associated with toxicants in food are often controlled by exposure reduction. When exposure recommendations are developed for foods with both harmful and beneficial qualities, however, they must balance the associated risks and benefits to maximize public health. Although quantitative methods are commonly used to evaluate health risks, such methods have not been generally applied to evaluating the health benefits associated with environmental exposures. A quantitative method for risk-benefit analysis is presented that allows for consideration of diverse health endpoints that differ in their impact (i.e., duration and severity) using dose-response modeling weighted by quality-adjusted life years saved. To demonstrate the usefulness of this method, the risks and benefits of fish consumption are evaluated using a single health risk and health benefit endpoint. Benefits are defined as the decrease in myocardial infarction mortality resulting from fish consumption, and risks are defined as the increase in neurodevelopmental delay (i.e., talking) resulting from prenatal methylmercury exposure. Fish consumption rates are based on information from Washington State. Using the proposed framework, the net health impact of eating fish is estimated in either a whole population or a population consisting of women of childbearing age and their children. It is demonstrated that across a range of fish methylmercury concentrations (0-1 ppm) and intake levels (0-25 g/day), individuals would have to weight the neurodevelopmental effects 6 times more (in the whole population) or 250 times less (among women of child-bearing age and their children) than the myocardial infarction benefits in order to be ambivalent about whether or not to consume fish. These methods can be generalized to evaluate the merits of other public health and risk management programs that involve trade-offs between risks and benefits.  
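The "ambivalence weight" reported above is just the ratio of QALYs saved by the benefit endpoint to QALYs lost to the risk endpoint: the multiplier on the risk that makes the net impact zero. A minimal sketch, with hypothetical QALY totals chosen to reproduce the whole-population factor of 6:

```python
# Hypothetical population-level totals (QALYs per 100,000 person-years);
# the paper derives these from dose-response models for each endpoint.
mi_qalys_saved = 120.0        # avoided myocardial infarction mortality
delay_qalys_lost = 20.0       # neurodevelopmental delay from methylmercury

# Weight w on the risk endpoint at which net impact w*risk - benefit = 0.
ambivalence_weight = mi_qalys_saved / delay_qalys_lost
print(f"risk endpoint must be weighted {ambivalence_weight:.0f}x "
      "to offset the benefit")
```

A weight above this ratio means the modeled risks outweigh the benefits; below it, the benefits dominate.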

5.
Biologic data on benzene metabolite doses, cytotoxicity, and genotoxicity often show that these effects do not vary directly with cumulative benzene exposure (i.e., concentration times time, or c × t ). To examine the effect of an alternate exposure metric, we analyzed cell-type specific leukemia mortality in Pliofilm workers. The work history of each Pliofilm worker was used to define each worker's maximally exposed job/department combination over time and the associated long-term average concentration associated with the maximally exposed job (LTA-MEJ). Using this measure, in conjunction with four job exposure estimates, we calculated SMRs for groups of workers with increasing LTA-MEJs. The analyses suggest that a critical concentration of benzene exposure must be reached in order for the risk of leukemia or, more specifically, AMML to be expressed. The minimum concentration is between 20 and 60 ppm depending on the exposure estimate and endpoint (all leukemias or AMMLs only). We believe these analyses are a useful adjunct to previous analyses of the Pliofilm data. They suggest that (a) AMML risk is shown only above a critical concentration of benzene exposure, measured as a long-term average and experienced for years, (b) the critical concentration is between 50 and 60 ppm when using a median exposure estimate derived from three previous exposure assessments, and is between 20 and 25 ppm using the lowest exposure estimates, and (c) risks for total leukemia are driven by risks for AMML, suggesting that AMML is the cell type related to benzene exposure.

6.
This article presents an analysis of postattack response strategies to mitigate the risks of reoccupying contaminated areas following a release of Bacillus anthracis spores (the bacterium responsible for causing anthrax) in an urban setting. The analysis is based on a hypothetical attack scenario in which individuals are exposed to B. anthracis spores during an initial aerosol release and then placed on prophylactic antibiotics that successfully protect them against the initial aerosol exposure. The risk from reoccupying buildings contaminated with spores due to their reaerosolization and inhalation is then evaluated. The response options considered include: decontamination of the buildings, vaccination of individuals reoccupying the buildings, extended evacuation of individuals from the contaminated buildings, and combinations of these options. The study uses a decision tree to estimate the costs and benefits of alternative response strategies across a range of exposure risks. Results for best estimates of model inputs suggest that the most cost‐effective response for high‐risk scenarios (individual chance of infection exceeding 11%) consists of evacuation and building decontamination. For infection risks between 4% and 11%, the preferred option is to evacuate for a short period, vaccinate, and then reoccupy once the vaccine has taken effect. For risks between 0.003% and 4%, the preferred option is to vaccinate only. For risks below 0.003%, none of the mitigation actions have positive expected monetary benefits. A sensitivity analysis indicates that for high‐infection‐likelihood scenarios, vaccination is recommended in the case where decontamination efficacy is less than 99.99%.
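The threshold structure reported above arises because each option trades a fixed intervention cost against a residual infection risk, so the cheapest option switches as the infection probability grows. A stripped-down sketch with three options and entirely hypothetical costs and efficacies (not the paper's values):

```python
VALUE_OF_CASE = 5_000_000       # monetized cost of one infection (hypothetical)

# Expected per-person cost of each option as a function of infection
# probability p after reoccupation (costs and efficacies are illustrative).
COSTS = {
    "do_nothing":         lambda p: p * VALUE_OF_CASE,
    "vaccinate":          lambda p: 500 + 0.1 * p * VALUE_OF_CASE,  # 90% effective
    "evac_decontaminate": lambda p: 100_000,                        # removes the risk
}

for p in (1e-5, 0.01, 0.5):
    best = min(COSTS, key=lambda k: COSTS[k](p))
    print(f"p = {p:g}: best = {best}")
```

Even with these toy numbers, the same qualitative regimes appear: do nothing at very low risk, vaccinate at intermediate risk, evacuate and decontaminate at high risk.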

7.
The relatively high failure rates, with important consequences in many cases, suggest that the implicitly acceptable risk levels corresponding to temporary civil engineering structures and activities might exceed the bounds of normally acceptable levels associated with different societal activities. Among other reasons, this may be attributed to the lack of a rational approach for the assessment of risks associated with the different technologies supporting these activities in general, and for structures in particular. There is a need for establishing appropriate target reliability levels for structures under temporary use, taking into account specific circumstances such as reduced risk exposure times. This issue is addressed in this article. Acceptance criteria for building-structure-related risks to persons obtained in prior studies are adapted to the special circumstances of nonpermanent risk exposure. Thereby, the general principle followed is to maintain the same risk levels per time unit as for permanently occupied buildings. The adaptation is based on the statistical annual fatality rate, a life safety risk metric that allows for a consistent comparison of risks across different societal activities and technologies. It is shown that the target reliability indices taking account of the temporary use of buildings might be significantly higher than the values suggested for permanently used structures.

8.
Access management, which systematically limits opportunities for egress and ingress of vehicles to highway lanes, is critical to protect trillions of dollars of current investment in transportation. This article addresses allocating resources for access management with incomplete and partially relevant data on crash rates, travel speeds, and other factors. While access management can be effective to avoid crashes, reduce travel times, and increase route capacities, the literature suggests a need for performance metrics to guide investments in resource allocation across large corridor networks and several time horizons. In this article, we describe a quantitative decision model to support an access management program via risk‐cost‐benefit analysis under data uncertainties from diverse sources of data and expertise. The approach quantifies potential benefits, including safety improvement and travel time savings, and costs of access management through functional relationships of input parameters including crash rates, corridor access point densities, and traffic volumes. Parameter uncertainties, which vary across locales and experts, are addressed via numerical interval analyses. This approach is demonstrated at several geographic scales across 7,000 kilometers of highways in a geographic region and several subregions. The demonstration prioritizes route segments that would benefit from risk management, including (i) additional data or elicitation, (ii) right‐of‐way purchases, (iii) restriction or closing of access points, (iv) new alignments, (v) developer proffers, and (vi) other measures. The approach ought to be of wide interest to analysts, planners, policymakers, and stakeholders who rely on heterogeneous data and expertise for risk management.

9.
Risk analysis for biological invasions is similar to other types of natural and human hazards. For example, risk analysis for chemical spills requires the evaluation of basic information on where a spill occurs; exposure level and toxicity of the chemical agent; knowledge of the physical processes involved in its rate and direction of spread; and potential impacts to the environment, economy, and human health relative to containment costs. Unlike typical chemical spills, biological invasions can have long lag times from introduction and establishment to successful invasion, they reproduce, and they can spread rapidly by physical and biological processes. We use a risk analysis framework to suggest a general strategy for risk analysis for invasive species and invaded habitats. It requires: (1) problem formation (scoping the problem, defining assessment endpoints); (2) analysis (information on species traits, matching species traits to suitable habitats, estimating exposure, surveys of current distribution and abundance); (3) risk characterization (understanding of data completeness, estimates of the "potential" distribution and abundance; estimates of the potential rate of spread; and probable risks, impacts, and costs); and (4) risk management (containment potential, costs, and opportunity costs; legal mandates and social considerations and information science and technology needs).

10.
As part of its assessment of the health risks associated with exposure to particulate matter (PM), the U.S. Environmental Protection Agency analyzed the risks associated with current levels, and the risk reductions that might be achieved by attainment of alternative PM standards, in two locations in the United States, Philadelphia and Los Angeles. The concentration-response function describing the relation between a health endpoint and ambient PM concentrations is an important component, and a source of substantial uncertainty, in such risk analyses. In the absence of location-specific estimates, the concentration-response functions necessary for risk assessments in Philadelphia and Los Angeles must be inferred from the available information in other locations. Although the functional form of the concentration-response relations is assumed to be the same everywhere, the value of the PM coefficient in that function may vary from one location to another. Under this model, a distribution describes the probability that the PM coefficient in a randomly selected location will lie in any range of interest. An empirical Bayes estimation technique was used to improve the estimation of location-specific concentration-response functions relating mortality to short-term exposure to particles of aerodynamic diameter less than or equal to 2.5 μm (PM2.5), for which functions have previously been estimated in several locations. The empirical Bayes-adjusted parameter values and their SEs were used to derive an estimate of the distribution of PM2.5 coefficients for mortality associated with short-term exposures. From this distribution, distributions of relative risks corresponding to different specified changes in PM2.5 concentrations could be derived.
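The empirical Bayes adjustment described above shrinks each location's noisy coefficient toward the pooled mean in proportion to its sampling imprecision. A minimal sketch with hypothetical coefficients and standard errors (not the study's data), using a method-of-moments estimate of the between-location variance:

```python
import statistics

# City-specific PM2.5 mortality coefficients and standard errors
# (hypothetical values, for illustration only).
betas = [0.0008, 0.0015, 0.0003, 0.0020, 0.0010]
ses   = [0.0002, 0.0005, 0.0001, 0.0007, 0.0003]

pooled = statistics.fmean(betas)
# Method-of-moments estimate of the between-city variance tau^2.
tau2 = max(statistics.pvariance(betas) - statistics.fmean([s**2 for s in ses]), 0.0)

# Shrink each city's estimate toward the pooled mean: precise estimates
# (small SE) move little, imprecise ones move a lot.
adjusted = [pooled + (tau2 / (tau2 + s**2)) * (b - pooled)
            for b, s in zip(betas, ses)]
for b, a in zip(betas, adjusted):
    print(f"{b:.4f} -> {a:.4f}")
```

The shrinkage property is what makes the adjusted values a better basis for a transferable distribution of coefficients: each adjusted estimate lies between the raw estimate and the pooled mean.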

11.
Travel Risks in a Time of Terror: Judgments and Choices
Shortly after the 2002 terrorist attacks in Bali, readers of Conde Nast Traveler magazine were surveyed regarding their views on the risks of travel to various destinations. Their risk estimates were highest for Israel, and lowest for Canada. Estimates for the different destinations correlated positively with (1) one another, (2) concern over aspects of travel that can make one feel at risk (e.g., sticking out as an American), (3) worries about other travel problems (e.g., contracting an infectious disease), and (4) attitudes toward risk. Respondents' willingness to travel to a destination was predicted well by whether their estimate of its risk was above or below their general threshold for the acceptability of travel risks. Overall, the responses suggest orderly choices, based on highly uncertain judgments of risks. Worry played a significant role in these choices, even after controlling for cognitive considerations, thereby supporting the recently proposed "risk as feelings" hypothesis. Thus, even among people who have generally consistent and defensible beliefs, emotions may affect choices. These results emerged with people selected for their interest in and experience with the decision domain (travel), but challenged to incorporate a new concern (terror).

12.
Approaches to risk assessment have been shown to vary among regulatory agencies and across jurisdictional boundaries according to the different assumptions and justifications used. Approaches to screening-level risk assessment from six international agencies were applied to an urban case study focusing on benzo[a]pyrene (B[a]P) exposure and compared in order to provide insight into the differences between agency methods, assumptions, and justifications. Exposure estimates ranged four-fold, with most of the dose stemming from exposure to animal products (8-73%) and plant products (24-88%). Total cancer risk across agencies varied by two orders of magnitude, with exposure to air and plant and animal products contributing most to total cancer risk, while the air contribution showed the greatest variability (1-99%). Variability in cancer risk of 100-fold was attributed to choices of toxicological reference values (TRVs), either based on a combination of epidemiological and animal data, or on animal data. The contribution and importance of the urban exposure pathway for cancer risk varied according to the TRV and, ultimately, according to differences in risk assessment assumptions and guidance. While all agency risk assessment methods are predicated on science, the study results suggest that the largest impact on the differential assessment of risk by international agencies comes from policy and judgment, rather than science.

13.
This paper describes the U.S. Environmental Protection Agency's assessment of potential health risks associated with the possible widespread use of a manganese (Mn)-based fuel additive, methylcyclopentadienyl manganese tricarbonyl (MMT). This assessment was significant in several respects and may be instructive in identifying certain methodological issues of general relevance to risk assessment. A major feature of the inhalation health risk assessment was the derivation of Mn inhalation reference concentration (RfC) estimates using various statistical approaches, including benchmark dose and Bayesian analyses. The exposure assessment component used data from the Particle Total Exposure Assessment Methodology (PTEAM) study and other sources to estimate personal exposure levels of particulate Mn attributable to the permitted use of MMT in leaded gasoline in Riverside, CA, at the time of the PTEAM study; on this basis it was then possible to predict a distribution of possible future exposure levels associated with the use of MMT in all unleaded gasoline. Qualitative as well as quantitative aspects of the risk characterization are summarized, along with inherent uncertainties due to data limitations.

14.
Risk Analysis, 2018, 38(6): 1223-1238
Implementation of probabilistic analyses in exposure assessment can provide valuable insight into the risks of those at the extremes of population distributions, including more vulnerable or sensitive subgroups. Incorporation of these analyses into current regulatory methods for occupational pesticide exposure is enabled by the exposure data sets and associated data currently used in the risk assessment approach of the Environmental Protection Agency (EPA). Monte Carlo simulations were performed on exposure measurements from the Agricultural Handler Exposure Database and the Pesticide Handler Exposure Database along with data from the Exposure Factors Handbook and other sources to calculate exposure rates for three different neurotoxic compounds (azinphos methyl, acetamiprid, emamectin benzoate) across four pesticide‐handling scenarios. Probabilistic estimates of doses were compared with the no observable effect levels used in the EPA occupational risk assessments. Some percentage of workers were predicted to exceed the level of concern for all three compounds: 54% for azinphos methyl, 5% for acetamiprid, and 20% for emamectin benzoate. This finding has implications for pesticide risk assessment and offers an alternative procedure that may be more protective of those at the extremes of exposure than the current approach.
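The Monte Carlo approach described above can be sketched in a few lines: sample each exposure factor from a distribution, combine them into a dose, and count the fraction of simulated workers whose dose exceeds the level of concern. All distributions and the NOEL below are hypothetical stand-ins for the database-derived inputs:

```python
import random

random.seed(1)

# Hypothetical handler-exposure inputs; the actual analysis draws from the
# Agricultural/Pesticide Handler Exposure Databases and the Exposure
# Factors Handbook.
NOEL = 2.5            # mg/kg/day, hypothetical no-observable-effect level
TARGET_MOE = 100      # required margin of exposure
LEVEL_OF_CONCERN = NOEL / TARGET_MOE

def simulated_dose():
    unit_exposure = random.lognormvariate(-5.0, 1.2)   # mg per lb a.i. handled
    lbs_handled = random.uniform(50, 400)              # lb a.i. handled per day
    body_weight = max(random.gauss(78, 12), 40)        # kg
    return unit_exposure * lbs_handled / body_weight

doses = [simulated_dose() for _ in range(10_000)]
exceed = sum(d > LEVEL_OF_CONCERN for d in doses) / len(doses)
print(f"fraction of simulated workers above the level of concern: {exceed:.1%}")
```

Unlike a single point estimate, the simulated distribution shows directly what fraction of the population exceeds the level of concern, which is the quantity the abstract reports per compound.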

15.
Typical exposures to lead often involve a mix of long-term exposures to relatively constant exposure levels (e.g., residential yard soil and indoor dust) and highly intermittent exposures at other locations (e.g., seasonal recreational visits to a park). These types of exposures can be expected to result in blood lead concentrations that vary on a temporal scale with the intermittent exposure pattern. Prediction of short-term (or seasonal) blood lead concentrations arising from highly variable intermittent exposures requires a model that can reliably simulate lead exposures and biokinetics on a temporal scale that matches that of the exposure events of interest. If exposure model averaging times (EMATs) of the model exceed the shortest exposure duration that characterizes the intermittent exposure, uncertainties will be introduced into risk estimates because the exposure concentration used as input to the model must be time averaged to account for the intermittent nature of the exposure. We have used simulation as a means of determining the potential magnitude of these uncertainties. Simulations using models having various EMATs have allowed exploration of the strengths and weaknesses of various approaches to time averaging of exposures and impact on risk estimates associated with intermittent exposures to lead in soil. The International Commission on Radiological Protection (ICRP) model of lead pharmacokinetics in humans simulates lead intakes that can vary in intensity over time spans as small as one day, allowing for the simulation of intermittent exposures to lead as a series of discrete daily exposure events. The ICRP model was used to compare the outcomes (blood lead concentration) of various time-averaging adjustments for approximating the time-averaged intake of lead associated with various intermittent exposure patterns. Results of these analyses suggest that standard approaches to time averaging (e.g., U.S. EPA) that estimate the long-term daily exposure concentration can, in some cases, result in substantial underprediction of short-term variations in blood lead concentrations when used in models that operate with EMATs exceeding the shortest exposure duration that characterizes the intermittent exposure. Alternative time-averaging approaches recommended for use in lead risk assessment more reliably predict short-term periodic (e.g., seasonal) elevations in blood lead concentration that might result from intermittent exposures. In general, risk estimates will be improved by simulation on shorter time scales that more closely approximate the actual temporal dynamics of the exposure.

16.
Kenneth T. Bogen, Risk Analysis, 2014, 34(10): 1780-1784
A 2009 report of the National Research Council (NRC) recommended that the U.S. Environmental Protection Agency (EPA) increase its estimates of increased cancer risk from exposure to environmental agents by roughly 7-fold, due to a typical ~25-fold ratio between the median and upper 95th percentile persons' cancer sensitivities, assuming approximately lognormally distributed sensitivities. EPA inaction on this issue has raised concerns that cancer risks to environmentally exposed populations remain systematically underestimated. This concern is unwarranted, however, because EPA point estimates of cancer risk have always pertained to the average, not the median, person in each modeled exposure group. Nevertheless, EPA has yet to explain clearly how its risk characterization and risk management policies concerning individual risks from environmental chemical carcinogens do appropriately address broad variability in human cancer susceptibility that has been a focus of two major NRC reports to EPA concerning its risk assessment methods.
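The 7-fold and 25-fold figures are linked by lognormal arithmetic: if the 95th percentile of sensitivity is 25 times the median, the lognormal shape parameter is σ = ln(25)/1.645, and the mean-to-median ratio is exp(σ²/2) ≈ 6.8, which is the recommended upward adjustment for a median-based estimate.

```python
import math

ratio_95_to_median = 25.0
z95 = 1.645                     # standard-normal 95th percentile

sigma = math.log(ratio_95_to_median) / z95     # lognormal shape parameter
mean_over_median = math.exp(sigma ** 2 / 2)    # lognormal mean/median ratio

print(f"sigma = {sigma:.3f}, mean/median = {mean_over_median:.1f}")
```

This also makes the abstract's central point concrete: an estimate that already pertains to the average (mean) person needs no such ~7-fold correction, because the correction converts median-person risk to mean-person risk.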

17.
This article quantifies potential public health risks from tumor-producing pollutants emitted from two synthetic-fuel plants (direct liquefaction, Exxon Donor Solvent; and indirect liquefaction, Lurgi Fischer-Tropsch) located at a representative site in the eastern United States. In these analyses gaseous and aqueous waste streams were characterized; exposures via inhalation, terrestrial and aquatic food chains, and drinking water supplies were modeled. Analysis suggested that emissions of "polycyclic aromatic hydrocarbons," "aromatic amines," "neutral N, O, S heterocyclics," "nitriles," and "other trace elements" pose the largest quantifiable risks to public health. Data and analysis for these pollutant categories should be refined to more accurately match compound-specific estimated exposure levels with tumorigenic potency estimates. Before these results are used for regulatory purposes, more detailed analyses of selected pollutant classes are needed, and more sophisticated aquatic exposure models must be developed. Also, differences in geographic scales among the environmental transport models used need to be rectified.

18.
Future development in cities needs to manage increasing populations, climate‐related risks, and sustainable development objectives such as reducing greenhouse gas emissions. Planners therefore face a challenge of multidimensional, spatial optimization in order to balance potential tradeoffs and maximize synergies between risks and other objectives. To address this, a spatial optimization framework has been developed. This uses a spatially implemented genetic algorithm to generate a set of Pareto‐optimal results that provide planners with the best set of trade‐off spatial plans for six risk and sustainability objectives: (i) minimize heat risks, (ii) minimize flooding risks, (iii) minimize transport travel costs to minimize associated emissions, (iv) maximize brownfield development, (v) minimize urban sprawl, and (vi) prevent development of greenspace. The framework is applied to Greater London (U.K.) and shown to generate spatial development strategies that are optimal for specific objectives and differ significantly from the existing development strategies. In addition, the analysis reveals tradeoffs between different risks as well as between risk and sustainability objectives. While increases in heat or flood risk can be avoided, there are no strategies that do not increase at least one of these. Tradeoffs between risk and other sustainability objectives can be more severe, for example, minimizing heat risk is only possible if future development is allowed to sprawl significantly. The results highlight the importance of spatial structure in modulating risks and other sustainability objectives. However, not all planning objectives are suited to quantified optimization and so the results should form part of an evidence base to improve the delivery of risk and sustainability management in future urban development.
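The core of any Pareto-optimal plan set is the dominance test: a plan is kept unless some other plan is at least as good on every objective and strictly better on at least one. A minimal sketch with three hypothetical objectives (all minimized) and toy plan scores:

```python
# Minimal Pareto filter: keep spatial plans not dominated on any objective.
# Objective values are hypothetical and "smaller is better" throughout.
plans = {
    "compact":   (3, 8, 2),   # (heat risk, flood risk, sprawl)
    "sprawled":  (1, 4, 9),
    "riverside": (2, 9, 3),
    "dominated": (4, 9, 4),
}

def dominates(a, b):
    """True if plan a is at least as good as b everywhere, better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = [n for n, v in plans.items()
          if not any(dominates(w, v) for m, w in plans.items() if m != n)]
print(pareto)
```

Note how the surviving plans embody the tradeoffs the abstract describes: "sprawled" has the lowest heat and flood risk but the worst sprawl, so no single plan wins on all objectives. A genetic algorithm, as used in the study, searches the space of spatial plans for this non-dominated front rather than filtering a fixed list.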

19.
Mortality effects of exposure to air pollution and other environmental hazards are often described by the estimated number of “premature” or “attributable” deaths and the economic value of a reduction in exposure as the product of an estimate of “statistical lives saved” and a “value per statistical life.” These terms can be misleading because the number of deaths advanced by exposure cannot be determined from mortality data alone, whether from epidemiology or randomized trials (it is not statistically identified). The fraction of deaths “attributed” to exposure is conventionally derived as the hazard fraction (R – 1)/R, where R is the relative risk of mortality between high and low exposure levels. The fraction of deaths advanced by exposure (the “etiologic” fraction) can be substantially larger or smaller: it can be as large as one and as small as 1/e (≈0.37) times the hazard fraction (if the association is causal and zero otherwise). Recent literature reveals misunderstanding about these concepts. Total life years lost in a population due to exposure can be estimated but cannot be disaggregated by age or cause of death. Economic valuation of a change in exposure-related mortality risk to a population is not affected by inability to know the fraction of deaths that are etiologic. When individuals facing larger or smaller changes in mortality risk cannot be identified, the mean change in population hazard is sufficient for valuation; otherwise, the economic value can depend on the distribution of risk reductions.
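The gap between the two fractions can be made concrete with a worked example using the bounds stated in the abstract (the relative risk value below is illustrative):

```python
import math

def attributable_bounds(relative_risk):
    """Hazard fraction (R-1)/R and the abstract's bounds on the etiologic
    fraction: between (1/e) x hazard fraction and 1, assuming causality."""
    hf = (relative_risk - 1.0) / relative_risk
    return hf, hf / math.e, 1.0

hf, lo, hi = attributable_bounds(1.5)
print(f"hazard fraction = {hf:.3f}, etiologic fraction in [{lo:.3f}, {hi:.1f}]")
```

For R = 1.5 the conventional "attributable" fraction is 1/3, but the fraction of deaths actually advanced by exposure could be anywhere from about 0.12 to 1, which is the identification problem the abstract emphasizes.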

20.
The crashes of four hijacked commercial planes on September 11, 2001, and the repeated televised images of the consequent collapse of the World Trade Center and one side of the Pentagon will inevitably change people's perceptions of the mortality risks to people on the ground from crashing airplanes. Goldstein and colleagues were the first to quantify the risk for Americans of being killed on the ground from a crashing airplane for unintentional events, providing average point estimates of 6 in a hundred million for annual risk and 4.2 in a million for lifetime risk. They noted that the lifetime risk result exceeded the commonly used risk management threshold of 1 in a million, and suggested that the risk to "groundlings" could be a useful risk communication tool because (a) it is a man-made risk, (b) arising from economic activities, (c) from which the victims derive no benefit, and (d) exposure to which the victims cannot control. Their results have been used in risk communication. This analysis provides updated estimates of groundling fatality risks from unintentional crashes using more recent data and a geographical information system approach to modeling the population around airports. The results suggest that the average annual risk is now 1.2 in a hundred million and the lifetime risk is now 9 in ten million (below the risk management threshold). Analysis of the variability and uncertainty of this estimate, however, suggests that the exposure to groundling fatality risk varies by a factor of approximately 100 with distance to an airport, with the risk declining rapidly outside the first 2 miles around an airport. We believe that the risk to groundlings from crashing airplanes is more useful in the context of risk communication when information about variability and uncertainty in the risk estimates is characterized, but we suspect that recent events will alter its utility in risk communication.
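The annual-to-lifetime conversion behind the quoted figures is simple multiplication; assuming a 75-year lifespan (an assumption consistent with the quoted numbers, not stated in the abstract), the updated annual risk reproduces the lifetime figure:

```python
annual_risk = 1.2e-8      # updated average annual groundling risk
lifetime_years = 75       # assumed lifespan for the conversion (hypothetical)

lifetime_risk = annual_risk * lifetime_years
print(f"lifetime risk = {lifetime_risk:.1e}")   # prints 9.0e-07, i.e., 9 in ten million
```

The same arithmetic applied to the earlier estimate (6e-8 per year) gives 4.5e-6, close to Goldstein and colleagues' 4.2-in-a-million lifetime figure, which is why the new lifetime estimate falls below the 1-in-a-million threshold while the old one exceeded it.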


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号