Similar Documents
20 similar documents found (search time: 31 ms)
1.
We examine whether the risk characterization estimated by catastrophic loss projection models is sensitive to the revelation of new information regarding risk type. We use commercial loss projection models from two widely employed modeling firms to estimate the expected hurricane losses of Florida Atlantic University's building stock, both including and excluding secondary information regarding hurricane mitigation features that influence damage vulnerability. We then compare the results of the models with and without this revealed information and find that the revelation of additional, secondary information influences modeled losses for the windstorm‐exposed university building stock, evidenced primarily by meaningful percentage differences in the loss exceedance output once secondary modifiers are incorporated in the analysis. Secondary risk characteristics for the data set studied appear to have a substantially greater impact on probable maximum loss estimates than on average annual loss estimates. While it may be intuitively expected for catastrophe models to indicate that secondary risk characteristics hold value for reducing modeled losses, the finding that the primary value of secondary risk characteristics lies in reducing losses in the “tail” (low-probability, high-severity) events is less intuitive, and therefore especially interesting. Further, we address the benefit‐cost tradeoffs that commercial entities must consider when deciding whether to undertake the data collection necessary to include secondary information in modeling. Although we assert that the long‐term benefit‐cost tradeoff is positive for virtually every entity, we acknowledge short‐term disincentives to such an effort.
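A minimal sketch of how the two metrics discussed above are read off a modeled annual-loss distribution, and how mitigation credits that act mostly on tail events move probable maximum loss (PML) more than average annual loss (AAL). The loss samples and the credit pattern are hypothetical, not output from either commercial model.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulated annual hurricane losses (USD) without secondary info
base = rng.lognormal(mean=15.0, sigma=1.2, size=100_000)
# Assume secondary mitigation credits reduce tail events more than routine ones
credit = np.where(base > np.quantile(base, 0.95), 0.6, 0.9)
with_secondary = base * credit

def aal_and_pml(losses, return_period=100):
    """AAL is the mean annual loss; PML here is the 1-in-N-year quantile."""
    return losses.mean(), np.quantile(losses, 1 - 1 / return_period)

aal0, pml0 = aal_and_pml(base)
aal1, pml1 = aal_and_pml(with_secondary)
print(f"AAL change:        {100 * (aal1 / aal0 - 1):+.1f}%")
print(f"100-yr PML change: {100 * (pml1 / pml0 - 1):+.1f}%")
```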

2.
Three modeling systems were used to estimate human health risks from air pollution: two versions of MNRiskS (for Minnesota Risk Screening), and the USEPA National Air Toxics Assessment (NATA). MNRiskS is a unique cumulative risk modeling system used to assess risks from multiple air toxics, sources, and pathways on a local to a state‐wide scale. In addition, ambient outdoor air monitoring data were available for estimation of risks and comparison with the modeled estimates of air concentrations. Highest air concentrations and estimated risks were generally found in the Minneapolis‐St. Paul metropolitan area and lowest risks in undeveloped rural areas. Emissions from mobile and area (nonpoint) sources created greater estimated risks than emissions from point sources. Highest cancer risks were via ingestion pathway exposures to dioxins and related compounds. Diesel particles, acrolein, and formaldehyde created the highest estimated inhalation health impacts. Model‐estimated air concentrations were generally highest for NATA and lowest for the AERMOD version of MNRiskS. This validation study showed reasonable agreement between available measurements and model predictions, although results varied among pollutants, and predictions were often lower than measurements. The results increased confidence in identifying pollutants, pathways, geographic areas, sources, and receptors of potential concern, and thus provide a basis for informing pollution reduction strategies and focusing efforts on specific pollutants (diesel particles, acrolein, and formaldehyde), geographic areas (urban centers), and source categories (nonpoint sources). The results heighten concerns about risks from food chain exposures to dioxins and PAHs. Risk estimates were sensitive to variations in methodologies for treating emissions, dispersion, deposition, exposure, and toxicity.

3.
A Latin Hypercube probabilistic risk assessment methodology was employed in the assessment of health risks associated with exposures to contaminated sediment and biota in an estuary in the Tidewater region of Virginia. The primary contaminants were polychlorinated biphenyls (PCBs), polychlorinated terphenyls (PCTs), polynuclear aromatic hydrocarbons (PAHs), and metals released into the estuary from a storm sewer system. The exposure pathways associated with the highest contaminant intake and risks were dermal contact with contaminated sediment and ingestion of contaminated aquatic and terrestrial biota from the contaminated area. As expected, all of the output probability distributions of risk were highly skewed, and the ratios of the expected value (mean) to median risk estimates ranged from 1.4 to 14.8 for the various exposed populations. The 99th percentile risk estimates were as much as two orders of magnitude above the mean risk estimates. For the sediment exposure pathways, the stability of the median risk estimates was found to be much greater than the stability of the expected value risk estimates. The interrun variability in the median risk estimate was found to be ±1.9% at 3000 iterations. The interrun stability of the mean risk estimates was found to be approximately equal to that of the 95th percentile estimates at any number of iterations. The variation in neither contaminant concentrations nor any other single input variable contributed disproportionately to the overall simulation variance. The inclusion or exclusion of spatial correlations among contaminant concentrations in the simulation model did not significantly affect either the magnitude or the variance of the simulation risk estimates for sediment exposures.
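A minimal sketch of Latin Hypercube propagation of exposure inputs to a risk estimate, of the kind described above. The input distributions, the intake equation, and the slope factor are illustrative placeholders, not the study's actual parameters.

```python
import numpy as np
from scipy.stats import qmc, lognorm

n = 3000
lhs = qmc.LatinHypercube(d=3, seed=1).random(n)  # stratified uniforms in [0,1)

# Map LHS strata through inverse CDFs of assumed input distributions
conc = lognorm(s=1.0, scale=0.5).ppf(lhs[:, 0])      # sediment PCB, mg/kg
contact = lognorm(s=0.8, scale=0.1).ppf(lhs[:, 1])   # dermal contact, kg/day
duration = lognorm(s=0.5, scale=0.3).ppf(lhs[:, 2])  # exposure-duration factor

slope_factor = 2.0  # cancer slope factor, (mg/kg-day)^-1, illustrative
risk = conc * contact * duration * slope_factor / 70.0  # 70 kg body weight

print(f"mean/median ratio: {risk.mean() / np.median(risk):.2f}")
print(f"99th pct / mean:   {np.quantile(risk, 0.99) / risk.mean():.1f}")
```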

4.
Recent studies demonstrating a concentration dependence of elimination of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) suggest that previous estimates of exposure for occupationally exposed cohorts may have underestimated actual exposure, resulting in a potential overestimate of the carcinogenic potency of TCDD in humans based on the mortality data for these cohorts. Using a database on U.S. chemical manufacturing workers potentially exposed to TCDD compiled by the National Institute for Occupational Safety and Health (NIOSH), we evaluated the impact of using a concentration- and age-dependent elimination model (CADM) (Aylward et al., 2005) on estimates of serum lipid area under the curve (AUC) for the NIOSH cohort. These data were used previously by Steenland et al. (2001) in combination with a first-order elimination model with an 8.7-year half-life to estimate cumulative serum lipid concentration (equivalent to AUC) for these workers for use in cancer dose-response assessment. Serum lipid TCDD measurements taken in 1988 for a subset of the cohort were combined with the NIOSH job exposure matrix and work histories to estimate dose rates per unit of exposure score. We evaluated the effect of choices in regression model (regression on untransformed vs. ln-transformed data and inclusion of a nonzero regression intercept) as well as the impact of choices of elimination models and parameters on estimated AUCs for the cohort. Central estimates for dose rate parameters derived from the serum-sampled subcohort were applied with the elimination models to time-specific exposure scores for the entire cohort to generate AUC estimates for all cohort members. Use of the CADM resulted in improved model fits to the serum sampling data compared to the first-order models. Dose rates varied by a factor of 50 among different combinations of elimination model, parameter sets, and regression models. Use of a CADM results in increases of up to five-fold in AUC estimates for the more highly exposed members of the cohort compared to estimates obtained using the first-order model with 8.7-year half-life. This degree of variation in the AUC estimates for this cohort would affect substantially the cancer potency estimates derived from the mortality data from this cohort. Such variability and uncertainty in the reconstructed serum lipid AUC estimates for this cohort, depending on elimination model, parameter set, and regression model, have not been described previously and are critical components in evaluating the dose-response data from the occupationally exposed populations.
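A minimal sketch of the simpler of the two models compared above: first-order elimination with an 8.7-year half-life, accumulating serum lipid concentration and its area under the curve (AUC). The dose rate and exposure window are hypothetical, and the CADM's concentration and age dependence of the elimination rate is deliberately omitted.

```python
import numpy as np

half_life_yr = 8.7
k = np.log(2) / half_life_yr            # first-order elimination rate, 1/yr
dt = 1 / 365                            # daily time step, in years

t = np.arange(0, 40, dt)
intake = np.where(t < 10, 50.0, 0.0)    # ppt/yr uptake during employment only

conc = np.zeros_like(t)
for i in range(1, len(t)):
    # Euler step for dC/dt = intake - k * C
    conc[i] = conc[i - 1] + dt * (intake[i - 1] - k * conc[i - 1])

auc = conc.sum() * dt                   # ppt-years of serum lipid TCDD
print(f"peak concentration: {conc.max():.0f} ppt; AUC: {auc:.0f} ppt-yr")
```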

5.
We analyzed the 1980 U.S. vital statistics and available ambient air pollution data bases for sulfates and fine, inhalable, and total suspended particles. Using multiple regression analyses, we conducted a cross-sectional analysis of the association between various particle measures and total mortality. Results from the various analyses indicated the importance of considering particle size, composition, and source information in modeling of particle pollution health effects. Of the independent mortality predictors considered, particle exposure measures related to the respirable and/or toxic fraction of the aerosols, such as fine particles and sulfates, were most consistently and significantly associated with the reported SMSA-specific total annual mortality rates. On the other hand, particle mass measures that included coarse particles (e.g., total suspended particles and inhalable particles) were often found to be nonsignificant predictors of total mortality. Furthermore, based on the application of fine particle source apportionment, particles from industrial sources (e.g., from iron/steel emissions) and from coal combustion were suggested to be more significant contributors to human mortality than soil-derived particles.
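A minimal sketch of a cross-sectional regression of area mortality rates on particle metrics, in the spirit of the analysis above. The data are synthetic and the coefficients arbitrary; the point is only the structure of the model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100  # metropolitan areas (SMSAs)
fine_pm = rng.normal(20, 5, n)        # fine-particle concentration, ug/m3
coarse_pm = rng.normal(40, 10, n)     # coarse fraction, ug/m3
mortality = 900 + 6 * fine_pm + rng.normal(0, 30, n)  # deaths per 100,000

X = sm.add_constant(np.column_stack([fine_pm, coarse_pm]))
fit = sm.OLS(mortality, X).fit()
print(fit.summary(xname=["const", "fine_pm", "coarse_pm"]))
```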

6.
This article presents a general model for estimating population heterogeneity and "lack of knowledge" uncertainty in methylmercury (MeHg) exposure assessments using two-dimensional Monte Carlo analysis. Using data from fish-consuming populations in Bangladesh, Brazil, Sweden, and the United Kingdom, predictive model estimates of dietary MeHg exposures were compared against those derived from biomarkers (i.e., [Hg]hair and [Hg]blood). By disaggregating parameter uncertainty into components (i.e., population heterogeneity, measurement error, recall error, and sampling error), estimates were obtained of the contribution of each component to the overall uncertainty. Steady-state diet:hair and diet:blood MeHg exposure ratios were estimated for each population and were used to develop distributions useful for conducting biomarker-based probabilistic assessments of MeHg exposure. The 5th and 95th percentile modeled MeHg exposure estimates around mean population exposure from each of the four study populations are presented to demonstrate lack of knowledge uncertainty about a best estimate for a true mean. Results from a U.K. study population showed that a predictive dietary model resulted in a 74% lower lack of knowledge uncertainty around a central mean estimate relative to a hair biomarker model, and also in a 31% lower lack of knowledge uncertainty around a central mean estimate relative to a blood biomarker model. Similar results were obtained for the Brazil and Bangladesh populations. Such analyses, used here to evaluate alternative models of dietary MeHg exposure, can be used to refine exposure instruments, improve information used in site management and remediation decision making, and identify sources of uncertainty in risk estimates.
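A minimal sketch of the two-dimensional Monte Carlo structure described above: an outer loop over uncertain ("lack of knowledge") parameters and an inner loop over inter-individual variability. Distribution choices and parameter values are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(7)
n_outer, n_inner = 500, 2000

pop_means = np.empty(n_outer)
for j in range(n_outer):
    # Outer loop: uncertain parameters of the intake distribution
    mu = rng.normal(np.log(0.1), 0.2)     # log fish-MeHg intake, ug/kg-day
    sigma = rng.uniform(0.4, 0.8)
    # Inner loop: inter-individual variability given those parameters
    intakes = rng.lognormal(mu, sigma, n_inner)
    pop_means[j] = intakes.mean()

lo, hi = np.percentile(pop_means, [5, 95])
print(f"90% uncertainty interval for the population mean: {lo:.3f}-{hi:.3f}")
```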

7.
Statistical procedures are developed to estimate accident occurrence rates from historical event records, to predict future rates and trends, and to estimate the accuracy of the rate estimates and predictions. Maximum likelihood estimation is applied to several learning models and results are compared to earlier graphical and analytical estimates. The models are based on (1) the cumulative number of operating years, (2) the cumulative number of plants built, and (3) accidents (explicitly), with the accident rate distinctly different before and after an accident. The statistical accuracies of the parameters estimated are obtained in analytical form using the Fisher information matrix. Using data on core damage accidents in electricity-producing plants, it is estimated that the probability for a plant to have a serious flaw has decreased from 0.1 to 0.01 during the developmental phase of the nuclear industry. At the same time the equivalent frequency of accidents has decreased from 0.04 per reactor year to 0.0004 per reactor year, partly due to the increasing population of plants.
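A minimal sketch of maximum-likelihood estimation for an accident rate that declines with cumulative operating experience, with approximate standard errors from the inverse Hessian (an estimate of the inverse Fisher information). The decay model and event counts are synthetic, not the paper's data.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
exposure = np.linspace(1, 400, 40)           # cumulative reactor-years
true_rate = 0.04 * np.exp(-0.01 * exposure)
events = rng.poisson(true_rate * 10.0)       # counts per 10 reactor-yr window

def negloglik(theta):
    lam0, beta = np.exp(theta)               # positivity via log-parameters
    mu = lam0 * np.exp(-beta * exposure) * 10.0
    return -(events * np.log(mu) - mu).sum() # Poisson log-likelihood (no const)

res = minimize(negloglik, x0=np.log([0.1, 0.005]), method="BFGS")
lam0, beta = np.exp(res.x)
se = np.sqrt(np.diag(res.hess_inv))          # approx. SEs on the log scale
print(f"initial rate {lam0:.4f}/ry, decay {beta:.4f}; log-scale SEs {se}")
```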

8.
Managing credit risk in financial institutions requires the ability to forecast aggregate losses on existing loans, predict the length of time that loans will be on the books before prepayment or default, analyze the expected performance of particular segments in the existing portfolio, and project payment patterns of new loans. Described in this paper are tools created for these functions in a large California financial institution. A forecasting model with Markovian structure and nonstationary transition probabilities is used to model the life of a mortgage. Logistic and regression models are used to estimate severity of losses. These models are integrated into a system that allows analysts and managers to depict the expected performance of individual loans and portfolio segments under different economic scenarios. With this information, analysts and managers can establish appropriate loss reserves, suggest pricing differentials to compensate for risk, and make strategic lending decisions.
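A minimal sketch of a loan-state Markov chain with nonstationary (age-dependent) transition probabilities, propagating a cohort of new mortgages to a 30-year horizon. The states and probabilities are illustrative, not the institution's fitted model.

```python
import numpy as np

states = ["current", "delinquent", "prepaid", "defaulted"]

def transition_matrix(age_months):
    """Monthly transition probabilities that drift with loan age."""
    p_prepay = min(0.002 + 0.0001 * age_months, 0.02)  # seasoning ramp
    p_delinq = 0.01
    return np.array([
        [1 - p_delinq - p_prepay, p_delinq, p_prepay, 0.00],
        [0.60, 0.35, 0.00, 0.05],
        [0.0, 0.0, 1.0, 0.0],    # prepaid is absorbing
        [0.0, 0.0, 0.0, 1.0],    # default is absorbing
    ])

dist = np.array([1.0, 0.0, 0.0, 0.0])    # cohort starts fully current
for age in range(360):                    # 30-year horizon, monthly steps
    dist = dist @ transition_matrix(age)

print(dict(zip(states, dist.round(4))))
```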

9.
The cost‐effective mitigation of adverse health effects caused by air pollution requires information on the contribution of different emission sources to exposure. In urban areas the exposure potential of different sources may vary significantly depending on emission height, population density, and other factors. In this study, we quantified this intraurban variability by predicting intake fraction (iF) for 3,066 emission sources in Warsaw, Poland. iF describes the fraction of the pollutant that is inhaled by people in the study area. We considered the following seven pollutants: particulate matter (PM), nitrogen oxides (NOx), sulfur dioxide (SO2), benzo[a]pyrene (BaP), nickel (Ni), cadmium (Cd), and lead (Pb). Emissions for these pollutants were grouped into four emission source categories (Mobile, Area, High Point, and Other Point sources). The dispersion of the pollutants was predicted with the CALPUFF dispersion model using the year 2005 emission rate data and meteorological records. The resulting annual average concentrations were combined with population data to predict the contribution of each individual source to population exposure. The iFs for different pollutant‐source category combinations varied between 51 per million (PM from Mobile sources) and 0.013 per million (sulfate PM from High Point sources). The intraurban iF variability for Mobile sources primary PM emission was from 4 per million to 100 per million with an emission‐weighted iF of 44 per million. These results suggest that exposure due to intraurban air pollution emissions could be decreased more effectively by specifically targeting sources with high exposure potency rather than all sources.
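A minimal sketch of an intake fraction (iF) calculation for a single source: inhaled mass divided by emitted mass. The concentration field, population grid, and breathing rate stand in for CALPUFF output and census data; all numbers are illustrative.

```python
import numpy as np

emission_rate = 1.0                    # g/s from the source
breathing_rate = 13.0 / 86400.0        # m3/s per person (13 m3/day)

# Annual-average incremental concentration (g/m3) and population per receptor
conc = np.array([2e-7, 8e-8, 3e-8, 1e-8])
population = np.array([5_000, 20_000, 80_000, 150_000])

inhaled = (conc * population * breathing_rate).sum()  # g/s inhaled in total
iF = inhaled / emission_rate
print(f"intake fraction: {iF * 1e6:.1f} per million")
```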

10.
Typical exposures to lead often involve a mix of long-term exposures to relatively constant exposure levels (e.g., residential yard soil and indoor dust) and highly intermittent exposures at other locations (e.g., seasonal recreational visits to a park). These types of exposures can be expected to result in blood lead concentrations that vary on a temporal scale with the intermittent exposure pattern. Prediction of short-term (or seasonal) blood lead concentrations arising from highly variable intermittent exposures requires a model that can reliably simulate lead exposures and biokinetics on a temporal scale that matches that of the exposure events of interest. If the exposure model averaging times (EMATs) of a model exceed the shortest exposure duration that characterizes the intermittent exposure, uncertainties will be introduced into risk estimates because the exposure concentration used as input to the model must be time averaged to account for the intermittent nature of the exposure. We have used simulation as a means of determining the potential magnitude of these uncertainties. Simulations using models having various EMATs have allowed exploration of the strengths and weaknesses of various approaches to time averaging of exposures and impact on risk estimates associated with intermittent exposures to lead in soil. The International Commission on Radiological Protection (ICRP) model of lead pharmacokinetics in humans simulates lead intakes that can vary in intensity over time spans as small as one day, allowing for the simulation of intermittent exposures to lead as a series of discrete daily exposure events. The ICRP model was used to compare the outcomes (blood lead concentration) of various time-averaging adjustments for approximating the time-averaged intake of lead associated with various intermittent exposure patterns. Results of these analyses suggest that standard approaches to time averaging (e.g., U.S. EPA) that estimate the long-term daily exposure concentration can, in some cases, result in substantial underprediction of short-term variations in blood lead concentrations when used in models that operate with EMATs exceeding the shortest exposure duration that characterizes the intermittent exposure. Alternative time-averaging approaches recommended for use in lead risk assessment more reliably predict short-term periodic (e.g., seasonal) elevations in blood lead concentration that might result from intermittent exposures. In general, risk estimates will be improved by simulation on shorter time scales that more closely approximate the actual temporal dynamics of the exposure.
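A minimal sketch of why long averaging times mask short-term peaks: an intermittent daily intake pattern versus its annual time average, driving a toy one-compartment blood lead model. This stand-in is far simpler than the ICRP biokinetic model, and the rate constants and absorption factor are assumptions.

```python
import numpy as np

k_elim = np.log(2) / 30.0              # ~30-day blood lead half-life, 1/day
days = np.arange(365)

# Intermittent pattern: high intake on 14 summer days, low otherwise
intermittent = np.where((days >= 180) & (days < 194), 20.0, 1.0)  # ug/day
averaged = np.full_like(intermittent, intermittent.mean())        # EMAT = 1 yr

def blood_lead(intake, absorb=0.05):
    """Daily one-compartment update; absorb maps ug/day intake to ug/dL."""
    c = np.zeros(len(intake))
    for t in range(1, len(intake)):
        c[t] = c[t - 1] + intake[t - 1] * absorb - k_elim * c[t - 1]
    return c

print(f"peak (intermittent):  {blood_lead(intermittent).max():.2f} ug/dL")
print(f"peak (time-averaged): {blood_lead(averaged).max():.2f} ug/dL")
```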

11.
Point source pollution is one of the main threats to regional environmental health. Based on a water quality model, a methodology to assess the regional risk of point source pollution is proposed. The assessment procedure includes five parts: (1) identifying risk source units and estimating source emissions using Monte Carlo algorithms; (2) observing hydrological and water quality data of the assessed area, and evaluating the selected water quality model; (3) screening out the assessment endpoints and analyzing receptor vulnerability with the Choquet fuzzy integral algorithm; (4) using the water quality model introduced in the second step to predict pollutant concentrations for various source emission scenarios and analyzing hazards of risk sources; and finally, (5) using the source hazard values and receptor vulnerability scores to estimate overall regional risk. The proposed method, based on the Water Quality Analysis Simulation Program (WASP), was applied in the region of the Taipu River, which is in the Taihu Basin, China. Results of source hazard and receptor vulnerability analysis allowed us to describe aquatic ecological, human health, and socioeconomic risks individually, and also integrated risks in the Taipu region, from a series of risk curves. Risk contributions of sources to receptors were ranked, and the spatial distribution of risk levels was presented. By changing the input conditions, we were able to estimate risks for a range of scenarios. Thus, the proposed procedure may also be used by decisionmakers for long‐term dynamic risk prediction.
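A minimal sketch of step (5) above: combining per-source hazard values with per-receptor vulnerability scores into a regional risk matrix. The values are illustrative; real inputs would come from the WASP scenario runs and the Choquet fuzzy integral analysis.

```python
import numpy as np

hazard = np.array([0.8, 0.3, 0.5])       # hazard value for each risk source
vulnerability = np.array([0.9, 0.4])     # vulnerability score per receptor

risk = np.outer(hazard, vulnerability)   # source x receptor risk matrix
print("per-source contributions:", risk.sum(axis=1))  # basis for ranking
print("overall regional risk:   ", risk.sum())
```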

12.
Modeling for Risk Assessment of Neurotoxic Effects
The regulation of noncancer toxicants, including neurotoxicants, has usually been based upon a reference dose (allowable daily intake). A reference dose is obtained by dividing a no-observed-effect level by uncertainty (safety) factors to account for intraspecies and interspecies sensitivities to a chemical. It is assumed that the risk at the reference dose is negligible, but generally no attempt is made to estimate the risk at the reference dose. A procedure is outlined that provides estimates of risk as a function of dose. The first step is to establish a mathematical relationship between a biological effect and the dose of a chemical. Knowledge of biological mechanisms and/or pharmacokinetics can assist in the choice of plausible mathematical models. The mathematical model provides estimates of average responses as a function of dose. Secondly, estimates of risk require selection of a distribution of individual responses about the average response given by the mathematical model. In the case of a normal or lognormal distribution, only an estimate of the standard deviation is needed. The third step is to define an adverse level for a response so that the probability (risk) of exceeding that level can be estimated as a function of dose. Because a firm response level often cannot be established at which adverse biological effects occur, it may be necessary to at least establish an abnormal response level that only a small proportion of individuals would exceed in an unexposed group. That is, if a normal range of responses can be established, then the probability (risk) of abnormal responses can be estimated. In order to illustrate this process, measures of the neurotransmitter serotonin and its metabolite 5-hydroxyindoleacetic acid in specific areas of the brain of rats and monkeys are analyzed after exposure to the neurotoxicant methylenedioxymethamphetamine. These risk estimates are compared with risk estimates from the quantal approach in which animals are classified as either abnormal or not depending upon abnormal serotonin levels.  相似文献
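A minimal sketch of the three-step procedure above, assuming an exponential dose-response for mean serotonin depletion, a normal distribution of individual responses, and an "abnormal" cutoff set at the unexposed 1st percentile. All parameter values are illustrative.

```python
import numpy as np
from scipy.stats import norm

baseline, sd = 100.0, 10.0              # control response level and spread
cutoff = norm(baseline, sd).ppf(0.01)   # abnormal = below unexposed 1st pct

def mean_response(dose):
    # Step 1: mathematical model for the average response vs. dose
    return baseline * np.exp(-0.05 * dose)

def risk(dose):
    # Steps 2-3: P(individual response falls below the abnormal cutoff)
    return norm(mean_response(dose), sd).cdf(cutoff)

for d in [0, 5, 10, 20]:
    print(f"dose {d:>2}: risk of abnormal response = {risk(d):.3f}")
```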

13.
Probabilistic risk analyses often construct multistage chance trees to estimate the joint probability of compound events. If random measurement error is associated with some or all of the estimates, we show that resulting estimates of joint probability may be highly skewed. Joint probability estimates based on the analysis of multistage chance trees are more likely than not to be below the true probability of adverse events, but will sometimes substantially overestimate them. In contexts such as insurance markets for environmental risks, skewed distributions of risk estimates amplify the "winner's curse" so that the estimated risk premium for low-probability events is likely to be lower than the normative value. Skewness may result even in unbiased estimators of expected value from simple lotteries, if measurement error is associated with both the probability and pay-off terms. Further, skewness may occur even if the error associated with these two estimates is symmetrically distributed. Under certain circumstances, skewed estimates of expected value may result in risk-neutral decisionmakers exhibiting a tendency to choose a certainty equivalent over a lottery of equal expected value, or vice versa. We show that when distributions of estimates of expected value are positively skewed, under certain circumstances it will be optimal to choose lotteries with nominal values lower than the value of apparently superior certainty equivalents. Extending the previous work of Goodman (1960), we provide an exact formula for the skewness of products.
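A minimal Monte Carlo demonstration of the central claim above: even when each stage probability carries unbiased, symmetric measurement error, the product across a multistage chance tree is right-skewed, so the median estimate falls below the true joint probability. Stage probabilities and error magnitudes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)
true_p = np.array([0.1, 0.2, 0.3])     # three-stage chance tree

noise = rng.normal(1.0, 0.3, size=(100_000, 3))   # symmetric, mean 1.0
estimates = (true_p * noise).prod(axis=1)

true_joint = true_p.prod()
print(f"true joint prob:  {true_joint:.4f}")
print(f"mean estimate:    {estimates.mean():.4f}")        # ~unbiased
print(f"median estimate:  {np.median(estimates):.4f}")    # below truth
print(f"P(underestimate): {(estimates < true_joint).mean():.2f}")  # > 0.5
```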

14.
In meeting its retail sales obligations, management of a local distribution company (LDC) must determine the extent to which it should rely on spot markets, forward contracts, and the increasingly popular long-term tolling agreements under which it pays a fee to reserve generator capacity. We address these issues by solving a mathematical programming model to derive the efficient frontier that summarizes the optimal tradeoffs available to the LDC between procurement risk and expected cost. To illustrate the approach, we estimate the expected procurement costs and associated variances that proxy for risk through a spot-price regression for the spot-purchase alternative and a variable-cost regression for the tolling-agreement alternative. The estimated regressions yield the estimates required to determine the efficient frontier. We develop several such frontiers under alternative assumptions as to the forward-contract price and the tolling agreement's capacity payment, and discuss the implications of our results for LDC management.
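A minimal sketch of the expected-cost vs. risk tradeoff when blending spot purchases with a tolling agreement. The cost moments and correlation are illustrative stand-ins for the regression-based estimates described above; a real frontier would come from the full mathematical program.

```python
import numpy as np

mean_spot, sd_spot = 55.0, 20.0   # $/MWh, volatile spot market
mean_toll, sd_toll = 62.0, 5.0    # $/MWh incl. capacity payment, stable
corr = 0.2

for w in np.linspace(0, 1, 6):    # w = share procured via tolling agreement
    mean_cost = w * mean_toll + (1 - w) * mean_spot
    var = ((w * sd_toll) ** 2 + ((1 - w) * sd_spot) ** 2
           + 2 * w * (1 - w) * corr * sd_toll * sd_spot)
    print(f"tolling share {w:.1f}: E[cost]={mean_cost:5.1f}, "
          f"sd={np.sqrt(var):5.1f}")
```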

15.
The extensive data from the Blair et al.(1) epidemiology study of occupational acrylonitrile exposure among 25,460 workers in eight plants in the United States provide an excellent opportunity to update quantitative risk assessments for this widely used commodity chemical. We employ the semiparametric Cox relative risk (RR) regression model with a cumulative exposure metric to model cause-specific mortality from lung cancer and all other causes. The separately estimated cause-specific cumulative hazards are then combined to provide an overall estimate of age-specific mortality risk. Age-specific estimates of the additional risk of lung cancer mortality associated with several plausible occupational exposure scenarios are obtained. For age 70, these estimates are all markedly lower than those generated with the cancer potency estimate provided in the USEPA acrylonitrile risk assessment.(2) This result is consistent with the failure of recent occupational studies to confirm elevated lung cancer mortality among acrylonitrile-exposed workers as was originally reported by O'Berg,(3) and it calls attention to the importance of using high-quality epidemiology data in the risk assessment process.
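A minimal sketch of a Cox proportional-hazards fit with a cumulative-exposure covariate, in the spirit of the analysis above. The cohort data are synthetic; the study itself fit cause-specific models with time-dependent exposure.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 2000
cum_exposure = rng.gamma(2.0, 5.0, n)        # ppm-years, illustrative
hazard = 0.01 * np.exp(0.002 * cum_exposure) # true log-linear RR model
time = rng.exponential(1 / hazard)
observed = time < 40                         # administrative censoring at 40 yr

df = pd.DataFrame({"T": np.minimum(time, 40.0), "E": observed,
                   "cum_exposure": cum_exposure})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()                          # RR per unit cumulative exposure
```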

16.
This paper describes the U.S. Environmental Protection Agency's assessment of potential health risks associated with the possible widespread use of a manganese (Mn)-based fuel additive, methylcyclopentadienyl manganese tricarbonyl (MMT). This assessment was significant in several respects and may be instructive in identifying certain methodological issues of general relevance to risk assessment. A major feature of the inhalation health risk assessment was the derivation of Mn inhalation reference concentration (RfC) estimates using various statistical approaches, including benchmark dose and Bayesian analyses. The exposure assessment component used data from the Particle Total Exposure Assessment Methodology (PTEAM) study and other sources to estimate personal exposure levels of particulate Mn attributable to the permitted use of MMT in leaded gasoline in Riverside, CA, at the time of the PTEAM study; on this basis it was then possible to predict a distribution of possible future exposure levels associated with the use of MMT in all unleaded gasoline. Qualitative as well as quantitative aspects of the risk characterization are summarized, along with inherent uncertainties due to data limitations.
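A minimal sketch of one ingredient of the RfC derivation mentioned above: a benchmark-dose calculation at a 10% extra-risk benchmark. The fitted log-logistic curve and its parameters are assumptions, and the lower confidence limit (BMDL) step is omitted.

```python
from scipy.optimize import brentq

def extra_risk(dose, bg=0.05, slope=0.8, d50=2.0):
    """Extra risk over background for an assumed log-logistic model."""
    p = bg + (1 - bg) / (1 + (d50 / dose) ** slope) if dose > 0 else bg
    return (p - bg) / (1 - bg)

bmr = 0.10  # benchmark response: 10% extra risk
bmd = brentq(lambda d: extra_risk(d) - bmr, 1e-6, 100.0)
print(f"BMD10 = {bmd:.3f} (illustrative dose units)")
```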

17.
The disease burden of pathogens as estimated by QMRA (quantitative microbial risk assessment) and EA (epidemiological analysis) often differs considerably. This is an unsatisfactory situation for policymakers and scientists. We explored methods to obtain a unified estimate using campylobacteriosis in the Netherlands as an example, where previous work resulted in estimates of 4.9 million (QMRA) and 90,600 (EA) cases per year. Using the maximum likelihood approach and considering EA the gold standard, the QMRA model could produce the original EA estimate by adjusting mainly the dose‐infection relationship. Considering QMRA the gold standard, the EA model could produce the original QMRA estimate by adjusting mainly the probability that a gastroenteritis case is caused by Campylobacter. A joint analysis of QMRA and EA data and models assuming identical outcomes, using a frequentist or Bayesian approach (using vague priors), resulted in estimates of 102,000 or 123,000 campylobacteriosis cases per year, respectively. These were close to the original EA estimate, and this will be related to the dissimilarity in data availability. The Bayesian approach further showed that relaxing the condition of equal outcomes immediately resulted in very different estimates of the number of campylobacteriosis cases per year and that using more informative priors had little effect on the results. In conclusion, EA was dominant in estimating the burden of campylobacteriosis in the Netherlands. However, it must be noted that only statistical uncertainties were taken into account here. Taking all, usually difficult to quantify, uncertainties into account might lead to a different conclusion.

18.
In order to analyze the role of limited commitment and preference heterogeneity in explaining the consumption allocation, I propose a theoretical and empirical framework to estimate and evaluate a risk‐sharing model where insurance transfers have to be self‐enforcing and the coefficient of relative risk aversion may depend on observable household characteristics. I compare this model to benchmark models with full commitment and/or without preference heterogeneity using data from three Indian villages. I find that the limited commitment model with heterogeneous preferences outperforms the benchmark models in a statistical sense and in terms of (i) explaining the dynamic response of consumption to idiosyncratic income shocks, (ii) accounting for the variation of consumption unexplained by household and time effects, and (iii) capturing the variation of inequality across time and villages and predicting changes in inequality. I also use the estimated models to predict the effects of a counterfactual tax and transfer policy on the consumption allocation. The limited commitment model with preference heterogeneity predicts larger benefits to the poor than its homogeneous counterpart. (JEL: C52, D10, D52)

19.
To quantify the health benefits of environmental policies, economists generally require estimates of the reduced probability of illness or death. For policies that reduce exposure to carcinogenic substances, these estimates traditionally have been obtained through the linear extrapolation of experimental dose-response data to low-exposure scenarios as described in the U.S. Environmental Protection Agency's Guidelines for Carcinogen Risk Assessment (1986). In response to evolving scientific knowledge, EPA proposed revisions to the guidelines in 1996. Under the proposed revisions, dose-response relationships would not be estimated for carcinogens thought to exhibit nonlinear modes of action. Such a change in cancer-risk assessment methods and outputs will likely have serious consequences for how benefit-cost analyses of policies aimed at reducing cancer risks are conducted. Any tendency for reduced quantification of effects in environmental risk assessments, such as those contemplated in the revisions to EPA's cancer-risk assessment guidelines, impedes the ability of economic analysts to respond to increasing calls for benefit-cost analysis. This article examines the implications for benefit-cost analysis of carcinogenic exposures of the proposed changes to the 1986 Guidelines and proposes an approach for bounding dose-response relationships when no biologically based models are available. In spite of the more limited quantitative information provided in a carcinogen risk assessment under the proposed revisions to the guidelines, we argue that reasonable bounds on dose-response relationships can be estimated for low-level exposures to nonlinear carcinogens. This approach yields estimates of reduced illness for use in a benefit-cost analysis while incorporating evidence of nonlinearities in the dose-response relationship. As an illustration, the bounding approach is applied to the case of chloroform exposure.
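A minimal sketch of one way to bound low-dose risk for a nonlinear carcinogen: linear extrapolation from a point of departure as the upper bound and a threshold (zero-risk) assumption as the lower bound. The point-of-departure values are hypothetical, and this is only one plausible reading of such a bounding approach.

```python
pod_dose, pod_risk = 10.0, 0.10   # point of departure (e.g., a BMD10)

def risk_bounds(dose):
    upper = pod_risk / pod_dose * dose  # linear-from-POD upper bound
    lower = 0.0                         # threshold (no-effect) lower bound
    return lower, upper

for d in [0.01, 0.1, 1.0]:
    lo, hi = risk_bounds(d)
    print(f"dose {d:>5}: risk in [{lo}, {hi:.5f}]")
```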

20.
Food‐borne infection is caused by intake of foods or beverages contaminated with microbial pathogens. Dose‐response modeling is used to estimate exposure levels of pathogens associated with specific risks of infection or illness. When a single dose‐response model is used and confidence limits on infectious doses are calculated, only data uncertainty is captured. We propose a method to estimate the lower confidence limit on an infectious dose by including model uncertainty and separating it from data uncertainty. The infectious dose is estimated by a weighted average of effective dose estimates from a set of dose‐response models via a Kullback information criterion. The confidence interval for the infectious dose is constructed by the delta method, where data uncertainty is addressed by a bootstrap method. To evaluate the actual coverage probabilities of the lower confidence limit, a Monte Carlo simulation study is conducted under sublinear, linear, and superlinear dose‐response shapes commonly found in real data sets. Our model‐averaging method achieves coverage close to nominal in almost all cases, thus providing a useful and efficient tool for accurate calculation of lower confidence limits on infectious doses.
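A minimal sketch of model averaging across dose-response forms with information-criterion weights, here using AIC-style weights as a stand-in for the paper's Kullback information criterion; the delta-method/bootstrap confidence limit step is omitted. The dosing data and the two model forms are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar, brentq

doses = np.array([10.0, 100.0, 1000.0, 10000.0])
n = np.array([20, 20, 20, 20])
infected = np.array([2, 6, 13, 19])

def fit(model):
    """One-parameter binomial MLE for a dose-response model."""
    def nll(p):
        pr = np.clip(model(doses, p), 1e-9, 1 - 1e-9)
        return -(infected * np.log(pr) + (n - infected) * np.log(1 - pr)).sum()
    res = minimize_scalar(nll, bounds=(1e-6, 1.0), method="bounded")
    return res.x, res.fun

expo = lambda d, r: 1 - np.exp(-r * d)           # exponential model
bp = lambda d, a: 1 - (1 + d / 100.0) ** -a      # beta-Poisson, fixed scale

params, nlls = zip(*(fit(m) for m in (expo, bp)))
aic = 2 * np.array(nlls) + 2                     # one free parameter each
w = np.exp(-0.5 * (aic - aic.min()))
w /= w.sum()                                     # information-criterion weights

# Model-averaged ED50: dose where the weighted response crosses 0.5
avg = lambda d: w[0] * expo(d, params[0]) + w[1] * bp(d, params[1]) - 0.5
print(f"weights: {w.round(2)}; model-averaged ED50 = {brentq(avg, 1, 1e5):.0f}")
```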
