Similar Articles
20 similar articles found.
1.
Risk assessment is the process of estimating the likelihood that an adverse effect may result from exposure to a specific health hazard. The process traditionally involves hazard identification, dose-response assessment, exposure assessment, and risk characterization to answer “How many excess cases of disease A will occur in a population of size B due to exposure to agent C at dose level D?” For natural hazards, however, we modify the risk assessment paradigm to answer “How many excess cases of outcome Y will occur in a population of size B due to natural hazard event E of severity D?” Using a modified version involving hazard identification, risk factor characterization, exposure characterization, and risk characterization, we demonstrate that epidemiologic modeling and measures of risk can quantify the risks from natural hazard events. We further extend the paradigm to address mitigation, the equivalent of risk management, to answer “What is the risk for outcome Y in the presence of prevention intervention X relative to the risk for Y in the absence of X?” We use the preventable fraction to estimate the efficacy of mitigation, or reduction in adverse health outcomes as a result of a prevention strategy under ideal circumstances, and further estimate the effectiveness of mitigation, or reduction in adverse health outcomes under typical community-based settings. By relating socioeconomic costs of mitigation to measures of risk, we illustrate that prevention effectiveness is useful for developing cost-effective risk management options.
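
The preventable-fraction arithmetic at the heart of this mitigation assessment is easy to make concrete. A minimal Python sketch follows; the rates, population size, and intervention effects are illustrative placeholders, not values from the study.

```python
# Sketch: efficacy vs. effectiveness of mitigation via the preventable
# fraction, PF = (R_without - R_with) / R_without. All rates, the population
# size, and the intervention effects are illustrative placeholders.

def preventable_fraction(risk_without: float, risk_with: float) -> float:
    """Proportion of adverse outcomes averted by the intervention."""
    return (risk_without - risk_with) / risk_without

# Risk of outcome Y per person in a hazard event of severity D
baseline_risk = 40 / 1000   # no intervention
risk_ideal = 10 / 1000      # intervention X under ideal conditions -> efficacy
risk_field = 22 / 1000      # intervention X in community settings -> effectiveness

efficacy = preventable_fraction(baseline_risk, risk_ideal)       # 0.75
effectiveness = preventable_fraction(baseline_risk, risk_field)  # 0.45

population = 50_000
cases_averted = (baseline_risk - risk_field) * population
print(f"Efficacy: {efficacy:.0%}, effectiveness: {effectiveness:.0%}, "
      f"cases averted in community use: {cases_averted:.0f}")
```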

2.
In this paper we show that a striking improvement in the explanatory power of a “dividend type” of security valuation model can be obtained by classifying companies into equivalent risk categories, estimating the discount factor for a category, and then constructing a cross-sectional model for it. The increased homogeneity of the data base improves the model's sensitivity to systematic forces, but does not sacrifice the heterogeneity of the independent variables. Assuming that the difference between the intrinsic value of a security and its market value should be zero, the authors demonstrate a method for estimating k_jt, the market discount rate for the j-th risk category in the t-th period. The results of the estimation procedure appear to be reasonable, and when used in our security valuation model they produce higher coefficients of determination (R^2) than those previously published for similar models.
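
A minimal sketch of the category-level estimation idea, assuming a Gordon-growth valuation P = D/(k - g) for concreteness (the paper's cross-sectional specification may differ); all inputs are invented:

```python
# Sketch: least-squares estimate of k_j, a common discount rate for one
# risk category, from the condition intrinsic value == market value.
# A Gordon-growth valuation P = D / (k - g) is assumed for concreteness;
# all inputs are invented.
import numpy as np
from scipy.optimize import minimize_scalar

dividends = np.array([2.0, 1.5, 3.2, 2.4])   # next-period dividends
growth = np.array([0.03, 0.02, 0.04, 0.03])  # dividend growth rates
prices = np.array([40.0, 25.0, 80.0, 48.0])  # observed market prices

def sse(k: float) -> float:
    """Sum of squared gaps between intrinsic and market values at rate k."""
    intrinsic = dividends / (k - growth)
    return float(np.sum((intrinsic - prices) ** 2))

# Search above the largest growth rate so every intrinsic value is finite
res = minimize_scalar(sse, bounds=(growth.max() + 1e-3, 0.5), method="bounded")
print(f"Estimated k_j = {res.x:.4f}")  # data above are consistent with 0.08
```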

3.
Risk Analysis, 2018, 38(8):1672–1684
A disease burden (DB) evaluation for environmental pathogens is generally performed using disability-adjusted life years with the aim of providing a quantitative assessment of the health hazard caused by pathogens. A critical step in the preparation for this evaluation is the estimation of morbidity between exposure and disease occurrence. In this study, the method of a traditional dose–response analysis was first reviewed, and the theoretical bases of a “single-hit” model and an “infection-illness” model were then combined by incorporating two critical factors: the “infective coefficient” and “infection duration.” This allowed a dose–morbidity model to be built for direct use in DB calculations. In addition, human experimental data for typical intestinal pathogens were obtained for model validation, and the results indicated that the model was well fitted and could be further used for morbidity estimation. On this basis, a real case of a water reuse project was selected for model application, and the morbidity as well as the DB caused by intestinal pathogens during water reuse was evaluated. The results show that the DB attributed to Enteroviruses was significant, while that for enteric bacteria was negligible. Therefore, water treatment technology should be further improved to reduce the exposure risk of Enteroviruses. Since road flushing was identified as the major exposure route, human contact with reclaimed water through this pathway should be limited. The methodology proposed for model construction not only compensates for missing morbidity data during risk evaluation, but also makes it possible to quantify the maximum possible DB.
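
The dose-morbidity chain described here can be sketched compactly. The exponential single-hit form and every parameter value below are illustrative assumptions, not the paper's fitted model:

```python
# Sketch: a dose-morbidity chain combining an exponential single-hit
# dose-response model with a conditional illness probability. The
# infectivity parameter r and the "infective coefficient" eta are
# illustrative placeholders, not fitted values.
import math

def p_infection(dose: float, r: float) -> float:
    """Single-hit model: each ingested organism independently infects."""
    return 1.0 - math.exp(-r * dose)

def morbidity(dose: float, r: float, eta: float) -> float:
    """P(illness) = P(infection) * P(illness | infection)."""
    return eta * p_infection(dose, r)

dose, r, eta = 50.0, 0.01, 0.3
m = morbidity(dose, r, eta)
print(f"P(infection) = {p_infection(dose, r):.3f}, morbidity = {m:.3f}")

# Expected annual burden if this morbidity feeds a DALY-style calculation
exposures_per_year = 100   # e.g., road-flushing contact events
daly_per_case = 0.005      # illustrative severity weight x duration
print(f"Burden = {m * exposures_per_year * daly_per_case:.4f} DALYs/person-year")
```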

4.
Ten years ago, the National Academy of Sciences released its risk assessment/risk management (RA/RM) “paradigm” that served to crystallize much of the early thinking about these concepts. By defining RA as a four-step process, operationally independent from RM, the paradigm has presented society with a scheme, or a conceptually common framework, for addressing many risky situations (e.g., carcinogens, noncarcinogens, and chemical mixtures). The procedure has facilitated decision-making in a wide variety of situations and has identified the most important research needs. The past decade, however, has revealed that additional progress is needed. These areas include addressing the appropriate interaction (not isolation) between RA and RM, improving the methods for assessing risks from mixtures, dealing with “adversity of effect,” deciding whether “hazard” should imply an exposure to environmental conditions or to laboratory conditions, and evolving the concept to include both health and ecological risk. Interest in and expectations of risk assessment are increasing rapidly. The emerging concept of “comparative risk” (i.e., distinguishing between large risks and smaller risks that may be qualitatively different) is at a level comparable to that held by the concept of “risk” just 10 years ago. Comparative risk stands in need of a paradigm of its own, especially given the current economic limitations. “Times are tough; Brother, can you paradigm?”

5.
This article examines the pricing policy of a monopolist seller who may sell in advance of consumption in a market that comprises myopic consumers, forward-looking consumers, and speculators. The latter group has no consumption value for the goods and is in the market with the sole objective of making a profit by reselling the purchased goods shortly after. Consumers, although homogeneous in terms of their valuations, are different with respect to their perspectives. We show that in an “upward” market where the expected valuation increases over time, the optimal pricing policy is an ex ante “static” one where the seller “prices into the future” and prices the myopic consumers out of the advance market. However, in a “downward” market where the expected valuation decreases over time, the seller adopts a dynamic pricing strategy except for the case when higher initial sales can trigger more demand subsequently and when the downward trend is not too high. In this case, the seller prefers an ex ante “static” pricing strategy and deliberately prices lower initially to sell to speculators. We identify the conditions under which the seller benefits from the existence of speculators in the market. Moreover, although the presence of entry costs is ineffective as an entry deterrence, we determine the conditions under which exit costs can rein in speculative purchase.

6.
If a specific biological mechanism could be determined by which a carcinogen increases lung cancer risk, how might this knowledge be used to improve risk assessment? To explore this issue, we assume (perhaps incorrectly) that arsenic in cigarette smoke increases lung cancer risk by hypermethylating the promoter region of gene p16INK4a, leading to a more rapid entry of altered (initiated) cells into a clonal expansion phase. The potential impact on lung cancer of removing arsenic is then quantified using a three-stage version of a multistage clonal expansion (MSCE) model. This refines the usual two-stage clonal expansion (TSCE) model of carcinogenesis by resolving its intermediate or “initiated” cell compartment into two subcompartments, representing experimentally observed “patch” and “field” cells. This refinement allows p16 methylation effects to be represented as speeding transitions of cells from the patch state to the clonally expanding field state. Given these assumptions, removing arsenic might greatly reduce the number of nonsmall cell lung cancer cells (NSCLCs) produced in smokers, by up to two-thirds, depending on the fraction (between 0 and 1) of the smoking-induced increase in the patch-to-field transition rate prevented if arsenic were removed. At present, this fraction is unknown (and could be as low as zero), but the possibility that it could be high (close to 1) cannot be ruled out without further data.
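
To make the mechanism concrete, here is a deterministic, expected-count caricature of the three-stage model; the published MSCE model is stochastic, and every rate below is an invented placeholder:

```python
# Sketch: deterministic expected-count caricature of a three-stage clonal
# expansion model. Normal cells initiate into "patch" cells, which convert
# to clonally expanding "field" cells at rate mu2; arsenic removal is
# modeled as undoing a fraction f of the smoking-induced increase in mu2.
# Every rate here is an invented placeholder.
import numpy as np
from scipy.integrate import solve_ivp

N = 1e7           # normal stem cells at risk (held constant)
mu1 = 1e-7        # normal -> patch initiation rate, per cell-year
mu2_base = 1e-3   # non-smoking patch -> field rate, per cell-year
mu2_smoke = 3e-3  # smoking-elevated rate (p16 hypermethylation, illustrative)
growth = 0.05     # net clonal expansion rate of field cells, per year

def rhs(t, y, mu2):
    patch, field = y
    return [mu1 * N - mu2 * patch,         # patch cells accumulate, then convert
            mu2 * patch + growth * field]  # field cells are seeded and expand

def field_cells_at(age, mu2):
    sol = solve_ivp(rhs, (0, age), [0.0, 0.0], args=(mu2,), rtol=1e-8)
    return sol.y[1, -1]

f = 0.8  # fraction of the smoking-induced increase removed with arsenic
mu2_no_arsenic = mu2_base + (1 - f) * (mu2_smoke - mu2_base)

reduction = 1 - field_cells_at(70, mu2_no_arsenic) / field_cells_at(70, mu2_smoke)
print(f"Expected field cells at age 70 reduced by {reduction:.0%}")
```

Because the patch compartment is far from equilibrium over a human lifespan, the expected field-cell count in this caricature scales nearly linearly with the patch-to-field rate, which is why the fraction f dominates the result.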

7.
Interview findings suggest perceived proximity to mapped hazards influences risk beliefs when people view environmental hazard maps. For dot maps, four attributes of mapped hazards influenced beliefs: hazard value, proximity, prevalence, and dot patterns. In order to quantify the collective influence of these attributes for viewers’ perceived or actual map locations, we present a model to estimate proximity-based hazard or risk (PBH) and share study results that indicate how modeled PBH and map attributes influenced risk beliefs. The randomized survey study among 447 university students assessed risk beliefs for 24 dot maps that systematically varied by the four attributes. Maps depicted water test results for a fictitious hazardous substance in private residential wells and included a designated “you live here” location. Of the nine variables that assessed risk beliefs, the numerical susceptibility variable was most consistently and strongly related to map attributes and PBH. Hazard value, location in or out of a clustered dot pattern, and distance had the largest effects on susceptibility. Sometimes, hazard value interacted with other attributes, for example, distance had stronger effects on susceptibility for larger than smaller hazard values. For all combined maps, PBH explained about the same amount of variance in susceptibility as did attributes. Modeled PBH may have utility for studying the influence of proximity to mapped hazards on risk beliefs, protective behavior, and other dependent variables. Further work is needed to examine these influences for more realistic maps and representative study samples.
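
One plausible way to operationalize a PBH score is an inverse-distance-weighted sum of dot hazard values, sketched below; the weighting form, coordinates, and values are assumptions for illustration, not the authors' exact model:

```python
# Sketch: one plausible proximity-based hazard (PBH) score for a viewer
# location on a dot map: hazard values weighted by inverse distance.
# The weighting form, coordinates, and values are assumptions for
# illustration, not the authors' exact model.
import numpy as np

def pbh(viewer_xy, dot_xy, dot_values, power=1.0, eps=1e-6):
    """Inverse-distance-weighted aggregate of dot hazard values."""
    d = np.linalg.norm(dot_xy - np.asarray(viewer_xy), axis=1)
    return float(np.sum(dot_values / (d + eps) ** power))

# Well-test dots: map coordinates and measured hazard values (e.g., ppb)
dots = np.array([[1.0, 2.0], [3.5, 0.5], [4.0, 4.0], [0.5, 3.0]])
values = np.array([12.0, 40.0, 5.0, 22.0])

print(f"PBH at 'you live here' (2, 2): {pbh((2.0, 2.0), dots, values):.1f}")
print(f"PBH far from the cluster (9, 9): {pbh((9.0, 9.0), dots, values):.1f}")
```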

8.
A major issue in all risk communication efforts is the distinction between the terms “risk” and “hazard.” The potential to harm a target such as human health or the environment is normally defined as a hazard, whereas risk also encompasses the probability of exposure and the extent of damage. What can be observed again and again in risk communication processes are misunderstandings and communication gaps related to these crucial terms. We asked a sample of 53 experts from public authorities, business and industry, and environmental and consumer organizations in Germany to outline their understanding and use of these terms using both the methods of expert interviews and focus groups. The empirical study made clear that the terms risk and hazard are perceived and used very differently in risk communication depending on the perspective of the stakeholders. Several factors can be identified, such as responsibility for hazard avoidance, economic interest, or a watchdog role. Thus, communication gaps can be reduced to a four-fold problem matrix comprising a semantic, conceptual, strategic, and control problem. The stakeholders' own worldviews played a major role in their specific use of the two terms hazard and risk in communication.

9.
Louis Anthony Cox, Jr. Risk Analysis, 2009, 29(12):1664–1671
Do pollution emissions from livestock operations increase infant mortality rate (IMR)? A recent regression analysis of changes in IMR against changes in aggregate “animal units” (a weighted sum of cattle, pig, and poultry numbers) over time, for counties throughout the United States, suggested the provocative conclusion that they do: “[A] doubling of production leads to a 7.4% increase in infant mortality.” Yet, we find that regressing IMR changes against changes in specific components of “animal units” (cattle, pigs, and broilers) at the state level reveals statistically significant negative associations between changes in livestock production (especially, cattle production) and changes in IMR. We conclude that statistical associations between livestock variables and IMR variables are very sensitive to modeling choices (e.g., selection of explanatory variables, and use of specific animal types vs. aggregate “animal units”). Such associations, whether positive or negative, do not warrant causal interpretation. We suggest that standard methods of quantitative risk assessment (QRA), including emissions release (source) models, fate and transport modeling, exposure assessment, and dose-response modeling, really are important—and indeed, perhaps, essential—for drawing valid causal inferences about health effects of exposures to guide sound, well-informed public health risk management policy. Reduced-form regression models, which skip most or all of these steps, can only quantify statistical associations (which may be due to model specification, variable selection, residual confounding, or other noncausal factors). Sound risk management requires the extra work needed to identify and model valid causal relations.

10.
Adam M. Finkel. Risk Analysis, 2014, 34(10):1785–1794
If exposed to an identical concentration of a carcinogen, every human being would face a different level of risk, determined by his or her genetic, environmental, medical, and other uniquely individual characteristics. Various lines of evidence indicate that this susceptibility variable is distributed rather broadly in the human population, with perhaps a factor of 25- to 50-fold between the center of this distribution and either of its tails, but cancer risk assessment at the EPA and elsewhere has always treated every (adult) human as identically susceptible. The National Academy of Sciences “Silver Book” concluded that EPA and the other agencies should fundamentally correct their mis-computation of carcinogenic risk in two ways: (1) adjust individual risk estimates upward to provide information about the upper tail; and (2) adjust population risk estimates upward (by about sevenfold) to correct an underestimation due to a mathematical property of the interindividual distribution of human susceptibility, in which the susceptibility averaged over the entire (right-skewed) population exceeds the median value for the typical human. In this issue of Risk Analysis, Kenneth Bogen disputes the second adjustment and endorses the first, though he also relegates the problem of underestimated individual risks to the realm of “equity concerns” that he says should have little if any bearing on risk management policy. In this article, I show why the basis for the population risk adjustment that the NAS recommended is correct—that current population cancer risk estimates, whether they are derived from animal bioassays or from human epidemiologic studies, likely provide estimates of the median with respect to human variation, which in turn must be an underestimate of the mean. If cancer risk estimates have larger “conservative” biases embedded in them, a premise I have disputed in many previous writings, such a defect would not excuse ignoring this additional bias in the direction of underestimation. I also demonstrate that sensible, legally appropriate, and ethical risk policy must not only inform the public when the tail of the individual risk distribution extends into the “high-risk” range, but must alter benefit-cost balancing to account for the need to try to reduce these tail risks preferentially.
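
The median-versus-mean point can be illustrated with a lognormal susceptibility distribution, for which mean/median = exp(σ²/2). The 35-fold spread assumed below is loosely motivated by the 25- to 50-fold range mentioned above; the NAS sevenfold correction rests on distributional details not reproduced here:

```python
# Sketch: for a lognormal susceptibility distribution, the population mean
# exceeds the median by exp(sigma^2 / 2). The 35-fold center-to-tail spread
# is loosely motivated by the 25- to 50-fold range cited above; the NAS
# sevenfold correction rests on details not reproduced here.
import math

# If the 97.5th percentile is ~35x the median, then 1.96 * sigma = ln(35)
sigma = math.log(35) / 1.96
print(f"sigma = {sigma:.2f}, mean/median = {math.exp(sigma**2 / 2):.1f}")
# ~5-fold here; heavier assumed tails push the correction higher
```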

11.
We consider a situation in which shippers (customers) can purchase ocean freight services either directly from a carrier (service provider) in advance or from the spot market just before the departure of an ocean liner. The price is known in the former case, while the spot price is uncertain ex-ante in the latter case. Consequently, some shippers are reluctant to book directly from the carrier in advance unless the carrier is willing to “partially match” the realized spot price when it is lower than the regular price. This study is an initial attempt to examine if the carrier should bear some of the “price risk” by offering a “fractional” price matching contract that can be described as follows. The shipper pays the regular freight price in advance; however, the shipper will get a refund if the realized spot price is below the regular price, where the refund is a “fraction” of the difference between the regular price and the realized spot price. By modeling the dynamics between the carrier and the shippers as a sequential game, we show that the carrier can use the fractional price matching contract to generate a higher demand from the shippers compared to no price matching contract by increasing the “fraction” in equilibrium. However, as the carrier increases the “fraction,” the carrier should increase the regular price to compensate for bearing additional risk. By selecting the fractional price matching contract optimally, we show that the carrier can afford to offer this price matching mechanism without incurring revenue loss: the optimal fractional price matching contract is “revenue neutral.”
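
The shipper's side of the contract reduces to a simple payoff rule: pay the regular price p up front, then receive a refund of phi * (p - s) whenever the realized spot price s falls below p. A small Monte Carlo sketch with invented numbers:

```python
# Sketch: shipper's payoff under a fractional price-matching contract.
# Pay regular price p in advance; if the realized spot price s < p, receive
# a refund of phi * (p - s). All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)

def effective_price(p, s, phi):
    """Net price paid by an advance booker after any refund."""
    return p - phi * max(p - s, 0.0)

p, phi = 1000.0, 0.6
spot = rng.normal(loc=950.0, scale=150.0, size=10_000)  # ex-ante uncertain

paid = np.array([effective_price(p, s, phi) for s in spot])
print(f"Expected effective price: {paid.mean():.1f} (regular price {p:.0f})")
print(f"Expected spot price:      {spot.mean():.1f}")
# Advance booking caps the price at p and refunds part of any downside,
# which is the risk-sharing that draws reluctant shippers into the market.
```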

12.
This article considers all 87 attacks worldwide against air and rail transport systems that killed at least two passengers over the 30-year period of 1982–2011. The data offer strong and statistically significant evidence that successful acts of terror have “gone to ground” in recent years: attacks against aviation were concentrated early in the three decades studied whereas those against rail were concentrated later. Recent data are used to make estimates of absolute and comparative risk for frequent flyers and subway/rail commuters. Point estimates in the “status quo” case imply that mortality risk from successful acts of terror was very low on both modes of transportation and that, whereas risk per trip is higher for air travelers than subway/rail commuters, the rail commuters experience greater risk per year than the frequent flyers.
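
The per-trip versus per-year distinction is pure arithmetic: annual risk is roughly per-trip risk times trips per year. A toy illustration with invented orders of magnitude, not the article's estimates:

```python
# Sketch: annual risk = per-trip risk x trips per year, so a mode with
# lower per-trip risk can still carry higher risk per year. The per-trip
# probabilities are invented orders of magnitude, not the article's values.
death_risk_per_flight = 1e-8
death_risk_per_rail_trip = 1e-9      # 10x lower per trip

flights_per_year = 40                # frequent flyer
rail_trips_per_year = 500            # commuter, two trips per workday

print(f"Annual risk, frequent flyer: {death_risk_per_flight * flights_per_year:.1e}")
print(f"Annual risk, rail commuter:  {death_risk_per_rail_trip * rail_trips_per_year:.1e}")
```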

13.
A new technique for deriving exogenous components of mortality risks from national vital statistics has been developed. Each observed death rate D_ij (where i indexes calendar time (a year or interval of years) and j the age group) was represented as D_ij = A_j + B_i·C_j, and the unknown quantities A_j, B_i, and C_j were estimated by a special procedure using the least-squares principle. The coefficients of variation do not exceed 10%. It is shown that the term A_j can be interpreted as the endogenous component of the death rate and the second term B_i·C_j as the exogenous component. The aggregate of endogenous components A_j can be described by a regression function corresponding to the Gompertz-Makeham law, A(τ) = γ + β·e^(ατ), where γ, β, and α are constants, τ is age, A(τ_j) = A_j, and τ_j is the value of age τ in the j-th age group. The coefficients of variation for such a representation do not exceed 4%. An analysis of exogenous risk levels in the Moscow and Russian populations during 1980–1995 shows that all components of exogenous risk in the Moscow population increased from 1992 up to 1994. The greatest contribution to the total level of exogenous risk came from lethal diseases, whose death rate was 387 deaths per 100,000 persons in 1994, i.e., 61.9% of all deaths. The dynamics of exogenous mortality risk during 1990–1994 in the Moscow population and in the Russian population excluding Moscow were identical: the risk was increasing, and its value in the Russian population was higher than in the Moscow population.
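
One simple way to estimate such a decomposition is alternating least squares on D_ij = A_j + B_i·C_j, followed by a Gompertz-Makeham fit to the endogenous component; the sketch below uses synthetic data and is not the authors' exact "special procedure":

```python
# Sketch: recovering D_ij = A_j + B_i * C_j by alternating least squares on
# synthetic data, then fitting Gompertz-Makeham A(tau) = gamma + beta *
# exp(alpha * tau) to the endogenous component. This is one simple way to
# estimate such a decomposition, not the authors' exact procedure.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
ages = np.arange(5.0, 85.0, 10.0)                 # age-group midpoints tau_j
years = np.arange(16)                             # calendar index i

A_true = 0.0005 + 0.00002 * np.exp(0.08 * ages)   # endogenous (Gompertz-Makeham)
B_true = 1.0 + 0.3 * np.sin(years / 3.0)          # exogenous period effect
C_true = 0.002 * np.exp(-ages / 40.0)             # exogenous age profile
D = (A_true + np.outer(B_true, C_true)) * rng.normal(1.0, 0.02, (16, 8))

A = D.min(axis=0) * 0.9            # start below every observed rate
C = (D - A).mean(axis=0)           # initial exogenous age profile
for _ in range(200):
    R = D - A                      # exogenous residual
    B = (R @ C) / (C @ C)          # least-squares update of B_i
    B /= B.mean()                  # fix the scale indeterminacy: mean(B) = 1
    C = (B @ R) / (B @ B)          # least-squares update of C_j
    A = (D - np.outer(B, C)).mean(axis=0)

def makeham(tau, gamma, beta, alpha):
    return gamma + beta * np.exp(alpha * tau)

params, _ = curve_fit(makeham, ages, A, p0=(1e-4, 1e-5, 0.08), maxfev=20000)
print("gamma, beta, alpha =", params)
```

The mean(B_i) = 1 normalization pins down the split between the endogenous offset and the exogenous rank-one term, mirroring the identifiability constraint any such decomposition needs.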

14.
Cost-benefit analyses of life-saving public programs typically focus on the number of expected deaths avoided (statistical lives saved) as the metric for evaluating benefits. Although this measure of population risk is clearly important, it ignores the distribution of underlying individual risks. A similar number of lives can be saved by protecting relatively large populations with relatively low baseline risk as can be saved by protecting smaller populations faced with higher baseline risks. Should the value of saving a statistical life be sensitive to the baseline levels of risk to exposed individuals? This paper addresses this issue by focusing specifically on individuals’ altruistic values with regard to life-saving programs. Using results from a survey, this study finds that when individuals are asked to state their preference for equally costly life-saving programs that will only affect others’ level of risk, they prefer those that save more lives. More importantly, however, controlling for the number of lives saved, they also prefer programs that affect smaller populations facing higher levels of baseline risk. Furthermore, the results suggest that each order-of-magnitude increase in the level of baseline risk to others approximately doubles the altruistic value component of a statistical life saved.

15.
Risk assessments are crucial for identifying and mitigating impacts from biological invasions. The Fish Invasiveness Scoring Kit (FISK) is a risk identification (screening) tool for freshwater fishes consisting of two subject areas: biogeography/history and biology/ecology. According to the outcomes, species can be classified under particular risk categories. The aim of this study was to apply FISK to the Iberian Peninsula, a Mediterranean climate region highly important for freshwater fish conservation due to a high level of endemism. In total, 89 fish species were assessed by three independent assessors. Results from receiver operating characteristic analysis showed that FISK can discriminate reliably between noninvasive and invasive fishes for Iberia, with a threshold of 20.25, similar to those obtained in several regions around the world. Based on mean scores, no species was categorized as “low risk,” 50 species as “medium risk,” 17 as “moderately high risk,” 11 as “high risk,” and 11 as “very high risk.” The highest scoring species was goldfish Carassius auratus. Mean certainty in response was above the category “mostly certain,” ranging from tinfoil barb Barbonymus schwanenfeldii with the lowest certainty to eastern mosquitofish Gambusia holbrooki with the highest level. Pairwise comparison showed significant differences between one assessor and the other two on mean certainty, with the latter two assessors showing a high rate of agreement on species categorization. Overall, the results suggest that FISK is a useful and viable tool for assessing risks posed by non-native fish in the Iberian Peninsula and contributes to a “watch list” in this region.
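
The ROC step can be sketched in a few lines: score species of known invasiveness, then pick the threshold maximizing Youden's J. The scores and labels below are synthetic, not FISK outputs:

```python
# Sketch: ROC analysis of invasiveness screening scores with a Youden-index
# threshold. The scores and labels are synthetic, not FISK outputs; the
# paper reports a calibrated threshold of 20.25 for Iberia.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(7)
scores = np.concatenate([rng.normal(12, 6, 60),    # known noninvasive
                         rng.normal(28, 7, 29)])   # known invasive
labels = np.concatenate([np.zeros(60), np.ones(29)])

fpr, tpr, thresholds = roc_curve(labels, scores)
best = thresholds[np.argmax(tpr - fpr)]  # Youden's J = sens + spec - 1
print(f"AUC = {roc_auc_score(labels, scores):.2f}, threshold = {best:.2f}")
```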

16.
Risk Analysis, 2018, 38(1):17–30
The extent of economic losses due to a natural hazard and disaster depends largely on the spatial distribution of asset values in relation to the hazard intensity distribution within the affected area. Given that statistical data on asset value are collected by administrative units in China, generating spatially explicit asset exposure maps remains a key challenge for rapid postdisaster economic loss assessment. The goal of this study is to introduce a top-down (or downscaling) approach to disaggregate administrative-unit-level asset value to the grid-cell level. The key step is to find highly correlated “surrogate” indicators. A combination of three data sets (nighttime light grid, LandScan population grid, and road density grid) is used as ancillary asset density distribution information for spatializing the asset value. As a result, a high spatial resolution asset value map of China for 2015 is generated. The spatial data set contains aggregated economic value at risk at 30 arc-second spatial resolution. Accuracy of the spatial disaggregation reflects redistribution errors introduced by the disaggregation process as well as errors from the original ancillary data sets. The overall accuracy of the results proves to be promising. An example application of the disaggregated asset value map to watershed exposure assessment demonstrates that the data set offers great analytical flexibility for overlay analysis according to the hazard extent. This product will help current efforts to analyze spatial characteristics of exposure and to uncover the contributions of both physical and social drivers of natural hazard and disaster across space and time.
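
The disaggregation itself is a weighted proportional allocation. A minimal sketch for one administrative unit, with invented mixing weights (the study calibrates the combination of ancillary layers rather than fixing it a priori):

```python
# Sketch: top-down disaggregation of one administrative unit's asset value
# to grid cells, weighted by nighttime light, population, and road density.
# The 0.4/0.4/0.2 mixing weights are invented; the study calibrates the
# combination of ancillary layers rather than fixing it a priori.
import numpy as np

rng = np.random.default_rng(3)
n_cells = 6
night_light = rng.uniform(0, 63, n_cells)     # nighttime light DN values
population = rng.uniform(0, 5000, n_cells)    # LandScan counts
road_density = rng.uniform(0, 2.0, n_cells)   # km of road per km^2

def normalize(x):
    return x / x.sum()

w = (0.4 * normalize(night_light) + 0.4 * normalize(population)
     + 0.2 * normalize(road_density))

county_asset_value = 1.8e9                    # total from official statistics
cell_values = county_asset_value * normalize(w)
print(np.round(cell_values / 1e6, 1))         # million per cell
print(f"Check: cells sum back to {cell_values.sum():.3e}")
```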

17.
Modern theories in cognitive psychology and neuroscience indicate that there are two fundamental ways in which human beings comprehend risk. The “analytic system” uses algorithms and normative rules, such as probability calculus, formal logic, and risk assessment. It is relatively slow, effortful, and requires conscious control. The “experiential system” is intuitive, fast, mostly automatic, and not very accessible to conscious awareness. The experiential system enabled human beings to survive during their long period of evolution and remains today the most natural and most common way to respond to risk. It relies on images and associations, linked by experience to emotion and affect (a feeling that something is good or bad). This system represents risk as a feeling that tells us whether it is safe to walk down this dark street or drink this strange-smelling water. Proponents of formal risk analysis tend to view affective responses to risk as irrational. Current wisdom disputes this view. The rational and the experiential systems operate in parallel and each seems to depend on the other for guidance. Studies have demonstrated that analytic reasoning cannot be effective unless it is guided by emotion and affect. Rational decision making requires proper integration of both modes of thought. Both systems have their advantages, biases, and limitations. Now that we are beginning to understand the complex interplay between emotion and reason that is essential to rational behavior, the challenge before us is to think creatively about what this means for managing risk. On the one hand, how do we apply reason to temper the strong emotions engendered by some risk events? On the other hand, how do we infuse needed “doses of feeling” into circumstances where lack of experience may otherwise leave us too “coldly rational”? This article addresses these important questions.

18.
Government actions to reduce risks to health have varied greatly in their cost per death prevented, frequently by 10-fold or even 100-fold. This research asks whether disparities of this magnitude are justified by citizens' preferences about the relative value of reducing deaths from different hazards. Four samples were asked to rank the relative priority of preventing deaths through 8 realistic programs, each addressed to a different hazard, and then to rate how large the differences in spending should be. Subjects were not asked to give absolute values on preventing deaths and were asked only for their relative valuation of the benefits of preventing a death, not to weigh the benefits and costs or to determine an optimal spending level. We found that in all samples the median respondent valued his top-rated program 5 to 6 times more than his bottom-rated program. However, because individuals disagreed upon the relative priority for different programs, the aggregated rankings barely showed more than a 2-fold difference in the amounts that should be spent. Thus, for the important programs considered by these samples, a large variation in spending does not appear to be justified on the basis of differentials in the values placed on preventing different types of deaths. A more deliberative methodology like the one used here appears fruitful for providing insights to policymakers about preferences in this sensitive area.

19.
The value of a statistical life (VSL) is a widely used measure for the value of mortality risk reduction. As VSL should reflect preferences and attitudes to risk, there are reasons to believe that it varies depending on the type of risk involved. It has been argued that cancer should be considered a “dread disease,” which supports the use of a “cancer premium.” The objective of this study is to investigate the existence of a cancer premium (for pancreatic cancer and multiple myeloma) in relation to road traffic accidents, sudden cardiac arrest, and amyotrophic lateral sclerosis (ALS). Data were collected from 500 individuals in the Swedish general population of 50–74-year-olds using a web-based questionnaire. Preferences were elicited using the contingent valuation method, and a split-sample design was applied to test scale sensitivity. VSL differs significantly between contexts, being highest for ALS and lowest for road traffic accidents. A premium (92–113%) for cancer was found in relation to road traffic accidents. The premium was higher for cancer with a shorter time from diagnosis to death. A premium was also found for sudden cardiac arrest (73%) and ALS (118%) in relation to road traffic accidents. Eliminating risk was associated with a premium of around 20%. This study provides additional evidence that both a dread premium and a risk-elimination premium exist. These factors should be considered when searching for an appropriate value for economic evaluation and health technology assessment.

20.
I recently discussed pitfalls in attempted causal inference based on reduced-form regression models. I used as motivation a real-world example from a paper by Dr. Sneeringer, which interpreted a reduced-form regression analysis as implying the startling causal conclusion that “doubling of [livestock] production leads to a 7.4% increase in infant mortality.” This conclusion is based on: (A) fitting a reduced-form regression model to aggregate (e.g., county-level) data; and (B) (mis)interpreting a regression coefficient in this model as a causal coefficient, without performing any formal statistical tests for potential causation (such as conditional independence, Granger-Sims, or path analysis tests). Dr. Sneeringer now adds comments that confirm and augment these deficiencies, while advocating methodological errors that, I believe, risk analysts should avoid if they want to reach logically sound, empirically valid, conclusions about cause and effect. She explains that, in addition to (A) and (B) above, she also performed other steps such as (C) manually selecting specific models and variables and (D) assuming (again, without testing) that hand-picked surrogate variables are valid (e.g., that log-transformed income is an adequate surrogate for poverty). In her view, these added steps imply that “critiques of A and B are not applicable” to her analysis and that therefore “a causal argument can be made” for “such a strong, robust correlation” as she believes her regression coefficient indicates. However, multiple wrongs do not create a right. Steps (C) and (D) exacerbate the problem of unjustified causal interpretation of regression coefficients, without rendering irrelevant the fact that (A) and (B) do not provide evidence of causality. This reply focuses on whether any statistical techniques can produce the silk purse of a valid causal inference from the sow's ear of a reduced-form regression analysis of ecological data. We conclude that Dr. Sneeringer's analysis provides no valid indication that air pollution from livestock operations causes any increase in infant mortality rates. More generally, reduced-form regression modeling of aggregate population data—no matter how it is augmented by fitting multiple models and hand-selecting variables and transformations—is not adequate for valid causal inference about health effects caused by specific, but unmeasured, exposures.
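
The core objection, that a reduced-form coefficient can be an artifact of confounding, is easy to demonstrate on synthetic data in which the true causal effect is zero by construction:

```python
# Sketch: a confounder (rural poverty, say) drives both livestock intensity
# and infant mortality; the reduced-form coefficient is then nonzero even
# though the direct causal effect is zero by construction. All variables
# and coefficients are synthetic.
import numpy as np

rng = np.random.default_rng(42)
n = 500                                       # "counties"
poverty = rng.normal(0, 1, n)                 # unobserved confounder
livestock = 0.8 * poverty + rng.normal(0, 1, n)
imr = 0.9 * poverty + rng.normal(0, 1, n)     # no livestock term at all

X = np.column_stack([np.ones(n), livestock])  # reduced-form: IMR ~ livestock
naive = np.linalg.lstsq(X, imr, rcond=None)[0][1]

X2 = np.column_stack([np.ones(n), livestock, poverty])
adjusted = np.linalg.lstsq(X2, imr, rcond=None)[0][1]

print(f"Naive coefficient: {naive:.3f}   (true direct effect: 0)")
print(f"Controlling for the confounder: {adjusted:.3f}")
```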
