Similar Documents
20 similar documents found.
1.
Rozell DJ, Reaven SJ. Risk Analysis, 2012, 32(8): 1382-1393.
In recent years, shale gas formations have become economically viable through the use of horizontal drilling and hydraulic fracturing. These techniques carry potential environmental risk because of their high water use and substantial potential for water pollution. Using probability bounds analysis, we assessed the likelihood of water contamination from natural gas extraction in the Marcellus Shale. Probability bounds analysis is well suited to cases where data are sparse and parameters highly uncertain. The study model identified five pathways of water contamination: transportation spills, well casing leaks, leaks through fractured rock, drilling site discharge, and wastewater disposal. Probability boxes were generated for each pathway. The potential contamination risk and epistemic uncertainty associated with hydraulic fracturing wastewater disposal were several orders of magnitude larger than those of the other pathways. Even in a best-case scenario, it was very likely that an individual well would release at least 200 m³ of contaminated fluids. Because the total number of wells in the Marcellus Shale region could range into the tens of thousands, this substantial potential risk suggests that additional steps be taken to reduce the potential for contaminated fluid leaks. To reduce the considerable epistemic uncertainty, more data should be collected on the ability of industrial and municipal wastewater treatment facilities to remove contaminants from used hydraulic fracturing fluid.
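The core of probability bounds analysis is propagating interval-valued (epistemic) parameters alongside distributional ones, yielding bounding CDFs (a "p-box") rather than a single curve. The following is a minimal sketch of that idea, not the authors' model; the wastewater-volume distribution and the leak-fraction interval are invented for illustration.

```python
import numpy as np

# Minimal probability-bounds sketch: bound the probability that released
# contaminated fluid exceeds a threshold when the leak fraction is only
# known to lie in an interval. All numbers are illustrative assumptions.

rng = np.random.default_rng(42)

wastewater_volume = rng.lognormal(mean=8.0, sigma=0.5, size=10_000)  # m^3 per well (assumed)
leak_fraction_lo, leak_fraction_hi = 0.01, 0.10  # epistemic interval (assumed)

# Lower and upper bounding distributions of contaminated release per well.
release_lo = wastewater_volume * leak_fraction_lo
release_hi = wastewater_volume * leak_fraction_hi

threshold = 200.0  # m^3, the benchmark cited in the abstract
p_lo = np.mean(release_lo >= threshold)  # lower bound on P(release >= 200 m^3)
p_hi = np.mean(release_hi >= threshold)  # upper bound
print(f"P(release >= {threshold} m^3) lies in [{p_lo:.2f}, {p_hi:.2f}]")
```

The width of the resulting interval is a direct measure of the epistemic uncertainty that the abstract recommends reducing with additional data.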

2.
In the event of contamination of a water distribution system, decisions must be made to mitigate the impact of the contamination and to protect public health. Making threat management decisions while a contaminant spreads through the network is a dynamic and interactive process. Response actions taken by the utility managers and water consumption choices made by the consumers will affect the hydraulics, and thus the spread of the contaminant plume, in the network. A modeling framework that allows the simulation of a contamination event under the effects of actions taken by utility managers and consumers is a useful tool for the analysis of alternative threat mitigation and management strategies. This article presents a multiagent modeling framework that combines agent-based, mechanistic, and dynamic methods. Agents select actions based on a set of rules that represent an individual's autonomy, goal-based desires, and reaction to the environment and the actions of other agents. Consumer behaviors are simulated, including ingestion, mobility, reduction of water demands, and word-of-mouth communication. Management strategies are evaluated, including opening hydrants to flush the contaminant and broadcasting warnings to consumers. As actions taken by consumer agents and utility operators affect demands and flows in the system, the mechanistic model is updated. Management strategies are evaluated based on the exposure of the population to the contaminant. The framework is designed to consider the typical issues involved in water distribution threat management and provides valuable analysis of threat containment strategies for water distribution system contamination events.
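A toy sketch of the agent-based ingredient follows: consumer agents cut demand once warned (by broadcast or word of mouth), which changes their cumulative exposure. The decaying plume is a stand-in for the hydraulic model; all rates and probabilities are invented.

```python
import random

# Toy agent-based sketch (not the authors' framework): consumers reduce
# demand after a broadcast or word-of-mouth warning, which changes how
# much contaminated water they draw and hence their exposure.

random.seed(1)

class Consumer:
    def __init__(self):
        self.warned = False
        self.exposure = 0.0

    def step(self, concentration, neighbors):
        if self.warned:
            demand = 0.1          # warned agents cut consumption (assumed factor)
        else:
            demand = 1.0
            # word-of-mouth: an unwarned agent may learn from a warned neighbor
            if any(n.warned for n in neighbors) and random.random() < 0.3:
                self.warned = True
        self.exposure += demand * concentration

agents = [Consumer() for _ in range(100)]
for t in range(24):                               # hourly steps over one day
    concentration = max(0.0, 1.0 - 0.05 * t)      # decaying plume (stand-in for hydraulics)
    if t == 6:                                    # utility broadcast at hour 6
        for a in random.sample(agents, 40):
            a.warned = True
    for a in agents:
        a.step(concentration, random.sample(agents, 3))

print("mean cumulative exposure:", sum(a.exposure for a in agents) / len(agents))
```

In the full framework described above, the demand changes would feed back into a hydraulic solver at each step instead of the fixed concentration profile used here.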

3.
An approach using biomarkers (biological responses) to assess the biological and ecological significance of contaminants present in the environment is described. Living organisms integrate exposure to contaminants in their environment and respond in measurable and predictable ways. Responses are observed at several levels of biological organization: from the biomolecular level, where pollutants can damage critical cellular macromolecules and elicit defensive strategies such as detoxication and repair mechanisms, to the organismal level, where severe disturbances are manifested as impaired growth or reproduction, developmental abnormalities, or decreased survival. Biomarkers can provide not only evidence of exposure to a broad spectrum of anthropogenic chemicals, but also a temporally integrated measure of bioavailable contaminant levels. A suite of biomarkers is evaluated over time to determine the magnitude of the problem and its possible consequences. Relationships between biomarker response and adverse ecological effects are determined from estimates of animal health and population structure.

4.
The decision process involved in cleaning sites contaminated with hazardous, mixed, and radioactive materials is often supported by results obtained from computer models. These results provide limits within which a decision-maker can judge the importance of individual transport and fate processes, and the likely outcome of alternative cleanup strategies. The transport of hazardous materials may occur predominately through one particular pathway but, more often, actual or potential transport must be evaluated across several pathways and media. Multimedia models are designed to simulate the transport of contaminants from a source to a receptor through more than one environmental pathway. Three such multimedia models are reviewed here: MEPAS, MMSOILS, and PRESTO-EPA-CPG. The reviews are based on documentation provided with the software, on published reviews, on personal interviews with the model developers, and on model summaries extracted from computer databases and expert systems. The three models are reviewed within the context of specific media components: air, surface water, ground water, and food chain. Additional sections evaluate how these three models calculate human exposure and dose and how they report uncertainty. Special emphasis is placed on how each model handles radionuclide transport within specific media. For simulating the transport, fate, and effects of radioactive contaminants through more than one pathway, both MEPAS and PRESTO-EPA-CPG are adequate for screening studies; MMSOILS handles only nonradioactive substances and must be modified before it can be used in these applications. Of the three models, MEPAS is the most versatile, especially if the user needs to model the transport, fate, and effects of both hazardous and radioactive contaminants.

5.
Federal and state drinking-water standards and guidelines do not exist for many contaminants analyzed by the U.S. Geological Survey's National Water-Quality Assessment Program, limiting the ability to evaluate the potential human-health relevance of water-quality findings. Health-based screening levels (HBSLs) were developed collaboratively to supplement existing drinking-water standards and guidelines as part of a six-year, multi-agency pilot study. The pilot study focused on groundwater samples collected prior to treatment or blending in areas of New Jersey where groundwater is the principal source of drinking water. This article describes how HBSLs were developed and demonstrates their use as a tool for evaluating water-quality data in a human-health context. HBSLs were calculated using standard U.S. Environmental Protection Agency (USEPA) methodologies and toxicity information. New HBSLs were calculated for 12 of 32 contaminants without existing USEPA drinking-water standards or guidelines, increasing the number of unregulated contaminants (those without maximum contaminant levels (MCLs)) that have human-health benchmarks. Concentrations of 70 of the 78 detected contaminants with human-health benchmarks were less than MCLs or HBSLs, including all 12 contaminants with new HBSLs, suggesting that most contaminant concentrations were not of potential human-health concern. HBSLs were applied to a state-scale groundwater data set in this study, but they may also be applied to regional and national evaluations of water-quality data. HBSLs fulfill a critical need for federal, state, and local agencies, water utilities, and others who seek tools for evaluating the occurrence of contaminants without drinking-water standards or guidelines.
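For a sense of the arithmetic behind a noncancer screening level of this kind, the sketch below follows the USEPA lifetime-health-advisory format (drinking water equivalent level scaled by a relative source contribution). The default body weight, intake, and source-contribution values are standard USEPA assumptions, not values taken from this article, and the example RfD is hypothetical.

```python
# Hedged sketch of a noncancer HBSL-style calculation in the lifetime
# health advisory format. Defaults below are common USEPA assumptions
# (70 kg adult, 2 L/day water intake, 20% relative source contribution).

def noncancer_hbsl(rfd_mg_per_kg_day: float,
                   body_weight_kg: float = 70.0,
                   water_intake_l_per_day: float = 2.0,
                   relative_source_contribution: float = 0.2) -> float:
    """Return a screening level in mg/L for a contaminant in drinking water."""
    # Drinking water equivalent level: dose rate a default adult could
    # receive from water alone at the RfD.
    dwel = rfd_mg_per_kg_day * body_weight_kg / water_intake_l_per_day
    return dwel * relative_source_contribution

# Example: a hypothetical contaminant with an RfD of 0.005 mg/kg-day.
print(f"HBSL ~ {noncancer_hbsl(0.005):.4f} mg/L")  # -> 0.0350 mg/L
```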

6.
Regulatory agencies often perform microbial risk assessments to evaluate the change in the number of human illnesses as the result of a new policy that reduces the level of contamination in the food supply. These agencies generally have regulatory authority over the production and retail sectors of the farm‐to‐table continuum. Any predicted change in contamination that results from a new policy regulating production practices occurs many steps prior to consumption of the product. This study proposes a framework for conducting microbial food‐safety risk assessments; this framework can be used to quantitatively assess the annual effects of national regulatory policies. Advantages of the framework are that estimates of human illnesses are consistent with national disease surveillance data (which are usually summarized on an annual basis) and that some of the modeling steps between production and consumption can be collapsed or eliminated. The framework leads to probabilistic models that include uncertainty and variability in critical input parameters; these models can be solved using a number of different Bayesian methods. The Bayesian synthesis method performs well for this application and generates posterior distributions of parameters that are relevant to assessing the effect of implementing a new policy. An example, based on Campylobacter and chicken, estimates the annual number of illnesses avoided by a hypothetical policy; this output could be used to assess the economic benefits of a new policy. Empirical validation of the policy effect is also examined by estimating the annual change in the numbers of illnesses observed via disease surveillance systems.
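To make the anchoring-to-surveillance idea concrete, here is a minimal conjugate sketch, not the Bayesian synthesis method itself: a Gamma prior on the annual illness rate is updated against annual case counts, and the posterior is used to predict illnesses avoided under a policy that scales risk by an assumed factor. All priors, counts, and the policy effect are invented.

```python
import numpy as np

# Minimal Gamma-Poisson sketch of anchoring a risk model to annual
# surveillance data and predicting the effect of a policy. Illustrative
# numbers only; not the article's Bayesian synthesis model.

rng = np.random.default_rng(0)

# Prior on annual illnesses attributable to the pathway (assumed, vague).
prior_shape, prior_rate = 2.0, 1e-5

# Surveillance: observed annual case counts over 3 years (assumed data).
observed = np.array([95_000, 102_000, 98_000])

post_shape = prior_shape + observed.sum()
post_rate = prior_rate + len(observed)

# Posterior draws of the baseline annual illness rate.
baseline = rng.gamma(post_shape, 1.0 / post_rate, size=50_000)

reduction = 0.20            # policy cuts contamination-linked risk 20% (assumed)
avoided = baseline * reduction
print(f"median illnesses avoided per year: {np.median(avoided):,.0f}")
print(f"95% credible interval: {np.percentile(avoided, [2.5, 97.5]).round(0)}")
```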

7.
Pielaat A. Risk Analysis, 2011, 31(9): 1434-1450.
A novel use of mathematical models in quantitative microbial risk assessment (QMRA) is to identify the sources of microbial contamination in a food chain (i.e., biotracing). In this article we propose a framework for the construction of a biotracing model, eventually to be used in industrial food production chains where discrete numbers of products are processed that may be contaminated by a multitude of sources. The framework consists of steps in which a Monte Carlo model, simulating sequential events in the chain following a modular process risk modeling (MPRM) approach, is converted to a Bayesian belief network (BBN). The resulting model provides a probabilistic quantification of pathogen concentrations throughout a production chain. A BBN allows the parameters of the model to be updated from observational data, and global parameter sensitivity analysis is readily performed in a BBN. Moreover, a BBN enables "backward reasoning" when downstream data are available and is therefore a natural framework for answering biotracing questions. The proposed framework is illustrated with a biotracing model of Salmonella in the pork slaughter chain, based on a recently published Monte Carlo simulation model. This model, implemented as a BBN, describes the dynamics of Salmonella in a Dutch slaughterhouse and enables finding the source of contamination of specific carcasses at the end of the chain.
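The "backward reasoning" a BBN offers reduces, in the simplest discrete case, to Bayes' rule over candidate sources. The tiny example below, with invented probabilities and an exclusive-source simplification, illustrates the inference pattern without any of the published model's structure.

```python
# Tiny discrete illustration (not the published model) of BBN-style
# backward reasoning: given a contaminated carcass at the end of the
# chain, infer which upstream source most likely contaminated it.
# All probabilities are invented for the example.

priors = {"lairage": 0.10, "splitting machine": 0.05, "worker hands": 0.02}
# P(carcass positive | that source active), assumed transfer probabilities.
transfer = {"lairage": 0.30, "splitting machine": 0.70, "worker hands": 0.50}

# Forward: probability a carcass tests positive via each single source.
joint = {s: priors[s] * transfer[s] for s in priors}
p_positive = sum(joint.values())  # simplification: sources treated as exclusive

# Backward: posterior source probabilities given a positive carcass.
posterior = {s: joint[s] / p_positive for s in joint}
for source, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"P({source} | carcass positive) = {p:.2f}")
```

Here the splitting machine, despite its low prior, dominates the posterior because of its high transfer probability, which is exactly the kind of conclusion a biotracing question seeks.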

8.
Government and private sector organizations are increasingly turning to the use of maps and other visual models to provide a depiction of environmental hazards and the potential risks they represent to humans and ecosystems. Frequently, the graphic presentation is tailored to address a specific contaminant, its location and possible exposure pathways, and potential receptors. Its format is usually driven by the data available, the choice of graphics technology, and the audience being served. A format that is effective for displaying one contaminant at one scale at one site, however, may be ineffective in accurately portraying the circumstances surrounding a different contaminant at the same site, or the same contaminant at a different site, because of limitations in available data or the graphics technology being used. This is the daunting challenge facing the U.S. Department of Energy (DOE), which is responsible for the nation's legacy wastes from nuclear weapons research, testing, and production at over 100 sites in the United States. In this article, we discuss the development and use of integrated geospatial mapping and conceptual site models to identify hazards and evaluate alternative long-term environmental clean-up strategies at DOE sites located across the United States. While the DOE probably has the greatest need for such information, the Department of Defense and other public and private parties responsible for many large and controversial National Priorities List or Superfund sites would benefit from a similar approach.

9.
A conceptual framework is presented for conducting exposure assessments under the U.S. EPA's Voluntary Children's Chemical Evaluation Program (VCCEP). The VCCEP is a voluntary program whereby companies that manufacture chemicals of potential concern are asked to conduct hazard, exposure, and risk assessments for those chemicals. The VCCEP is unique in its risk-based, tiered approach, its focus on children, and its requirement that all reasonably foreseeable exposure pathways for a particular chemical be comprehensively considered. The consideration of all potential exposure pathways for some commonly used chemicals presents a daunting challenge for the exposure assessor. This article presents a framework for managing this complicated process, and illustrates its application with a hypothetical case study. The framework provides guidance for interpreting multiple sources of exposure information and developing a plausible list of exposure pathways for a chemical. Furthermore, the framework provides a means to process all the available information so that pathways of negligible concern can be eliminated from consideration. Finally, the framework provides guidance for utilizing the tiered approach of the VCCEP to conduct an assessment efficiently: first using simple, screening-level approaches and then, if necessary, using more complex, refined exposure assessment methods. The case study illustrates the major concepts.

10.
The evaluation of the risk of water quality failures in a distribution network is a challenging task given that much of the available data are highly uncertain and vague, and many of the mechanisms are not fully understood. Consequently, a systematic approach is required to handle quantitative-qualitative data, as well as a means to update existing information when new knowledge and data become available. Five general pathways (mechanisms) through which a water quality failure can occur in the distribution network are identified in this article: contaminant intrusion, leaching and corrosion, biofilm formation and microbial regrowth, permeation, and water treatment breakthrough (including disinfection byproduct formation). The proposed methodology is demonstrated using a simplified example for water quality failures in a distribution network. This article builds upon previous developments of the aggregative risk analysis approach. Each basic risk item in a hierarchical framework is expressed by a triangular fuzzy number, derived from the composition of the likelihood of a failure event and the associated failure consequence. An analytic hierarchy process is used to estimate the weights required for grouping noncommensurate risk sources. Evidential reasoning is proposed to incorporate newly arrived data for updating existing risk estimates. Exponential ordered weighted averaging operators are used for defuzzification to incorporate an attitudinal dimension for risk management. It is envisaged that the proposed approach could serve as a basis for benchmarking acceptable risks in water distribution networks.
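The fuzzy-aggregation step can be sketched compactly: each risk item is a triangular fuzzy number (low, mode, high), and a weighted sum of triangular numbers is again triangular. The sketch below uses centroid defuzzification for brevity rather than the article's exponential OWA operators; all numbers and weights are invented.

```python
import numpy as np

# Sketch of aggregating triangular fuzzy risk numbers with AHP-style
# weights. Centroid defuzzification is used here in place of the
# article's exponential OWA operators; values are illustrative.

risk_items = {                       # (l, m, u) on a 0-1 risk scale (assumed)
    "intrusion":  (0.10, 0.30, 0.55),
    "leaching":   (0.05, 0.15, 0.35),
    "biofilm":    (0.20, 0.40, 0.70),
}
weights = {"intrusion": 0.5, "leaching": 0.2, "biofilm": 0.3}  # from an assumed AHP

# Weighted sum of triangular fuzzy numbers is again triangular.
agg = np.zeros(3)
for name, tfn in risk_items.items():
    agg += weights[name] * np.array(tfn)

l, m, u = agg
print(f"aggregated fuzzy risk: ({l:.3f}, {m:.3f}, {u:.3f})")
print(f"centroid defuzzification: {(l + m + u) / 3:.3f}")
```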

11.
Thekdi SA, Lambert JH. Risk Analysis, 2012, 32(7): 1253-1269.
Coordination and layering of models to identify risks in complex systems, such as large-scale energy, water, and transportation infrastructure, is of current interest across application domains. Such infrastructures are increasingly vulnerable to adjacent commercial and residential land development. Land development can compromise the performance of essential infrastructure systems and increase the costs of maintaining or increasing performance. A risk-informed approach to this topic would be useful to avoid surprise, regret, and the need for costly remedies. This article develops a layering and coordination of models for risk management of land development affecting infrastructure systems. The layers are: system identification, expert elicitation, predictive modeling, comparison of investment alternatives, and implications of current decisions for future options. The modeling layers share a focus on the observable factors that most contribute to volatility of land development and land use. The relevant data and expert evidence include current and forecasted growth in population and employment, conservation and preservation rules, land topography and geometries, real estate assessments, market and economic conditions, and other factors. The approach integrates into a decision framework of strategic considerations based on assessing risk, cost, and opportunity, in order to prioritize needs and potential remedies that mitigate the impacts of land development on the infrastructure systems. The approach is demonstrated for a 5,700-mile multimodal transportation system adjacent to 60,000 tracts of potential land development.

12.
In the days following the collapse of the World Trade Center (WTC) towers on September 11, 2001 (9/11), the U.S. Environmental Protection Agency (EPA) initiated numerous air monitoring activities to better understand the ongoing impact of emissions from that disaster. Using these data, EPA conducted an inhalation exposure and human health risk assessment for the general population. This assessment does not address exposures and potential impacts that could have occurred to rescue workers, firefighters, and other site workers, nor does it address exposures that could have occurred in the indoor environment. Contaminants evaluated include particulate matter (PM), metals, polychlorinated biphenyls, dioxins, asbestos, volatile organic compounds, particle-bound polycyclic aromatic hydrocarbons, silica, and synthetic vitreous fibers (SVFs). This evaluation yielded three principal findings. (1) Persons exposed to extremely high levels of ambient PM and its components, SVFs, and other contaminants during the collapse of the WTC towers, and for several hours afterward, were likely to be at risk for acute and potentially chronic respiratory effects. (2) Available data suggest that contaminant concentrations within and near ground zero (GZ) remained significantly elevated above background levels for a few days after 9/11. Because only limited data were available for these critical few days, exposures and potential health impacts could not be evaluated with certainty for this period. (3) Except for inhalation exposures that may have occurred on 9/11 and a few days afterward, the ambient air concentration data suggest that persons in the general population were unlikely to suffer short-term or long-term adverse health effects caused by inhalation exposures. While this analysis by EPA evaluated the potential for health impacts based on measured air concentrations, epidemiological studies conducted by organizations other than EPA have attempted to identify actual impacts. Such studies have identified respiratory effects in worker and general populations, and developmental effects in newborns whose mothers were near GZ on 9/11 or shortly thereafter. While researchers cannot identify the specific times or exactly which contaminants caused these effects, they have concluded that exposure to WTC contaminants (and/or maternal stress, in the case of developmental effects) was responsible, and have identified the period including 9/11 itself and the days and few weeks afterward as the time of greatest concern, based on the high concentrations of key pollutants in the air and dust.

13.
Cheese smearing is a complex process, and the potential for cross-contamination with pathogenic or undesirable microorganisms is critical. During ripening, cheeses are salted and washed with brine to develop flavor and remove molds that could develop on the surfaces. Considering the potential for cross-contamination in this process in quantitative risk assessments could contribute to a better understanding of this phenomenon and, eventually, improve its control. The purpose of this article is to model the cross-contamination of smear-ripened cheeses due to the smearing operation under industrial conditions. A compartmental, dynamic, and stochastic model is proposed for mechanical brush smearing, developed to describe the exchange of microorganisms between compartments. Based on the analytical solution of the model equations and on experimental data collected with an industrial smearing machine, we assessed the values of the transfer parameters of the model. Monte Carlo simulations, using the distributions of the transfer parameters, provide the final number of contaminated products in a batch and their final level of contamination for a given scenario, taking into account the initial number of contaminated cheeses in the batch and their contaminant load. Based on analytical results, the model provides indicators of smearing efficiency and of the propensity of the process for cross-contamination. Unlike traditional mechanistic models, our approach captures the variability and uncertainty inherent in the process and the experimental data. More generally, this model could serve as a generic base for modeling similar processes prone to cross-contamination.
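The compartmental idea can be sketched with two exchange fractions: at each cheese, some cells move cheese-to-brush and some brush-to-cheese, so a few contaminated units seed the rest of the batch through the brush. The transfer fractions and microbial loads below are invented, not the fitted industrial values.

```python
import numpy as np

# Compartmental sketch of brush-smearing cross-contamination. Cheeses pass
# under the brush in sequence; fixed fractions of cells are exchanged at
# each contact. Parameters are illustrative assumptions.

rng = np.random.default_rng(7)

n_cheese = 200
initial_contaminated = 5                # cheeses contaminated before smearing
load = np.zeros(n_cheese)
load[rng.choice(n_cheese, initial_contaminated, replace=False)] = 1e4  # CFU (assumed)

brush = 0.0                             # contaminant pool on the brush
a = 0.10                                # fraction transferred cheese -> brush (assumed)
b = 0.02                                # fraction transferred brush -> cheese (assumed)

final = np.empty(n_cheese)
for i in range(n_cheese):
    to_brush = a * load[i]
    to_cheese = b * brush
    brush += to_brush - to_cheese
    final[i] = load[i] - to_brush + to_cheese

print("contaminated cheeses after smearing (>1 CFU):", int((final > 1).sum()))
print(f"max final load: {final.max():.0f} CFU")
```

Wrapping this loop in an outer Monte Carlo loop over parameter distributions yields the batch-level outputs described in the abstract.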

14.
Food web models have two uses in assessments of environmental contaminants. First, they are used to determine whether remediation is needed, by estimating exposure of end-point species and the subsequent effects. Second, they are used to establish cleanup goals, by estimating concentrations of contaminants in ambient media that will not cause significant effects. This paper demonstrates how achievement of these goals can be enhanced by the use of stochastic food web models. The models simulate the dynamics of PCBs and mercury in the food webs of mink and great blue herons. All parameters of the models are treated as having knowledge uncertainty, due to imperfect knowledge of the actual parameter values for the site, chemicals, and species of interest. This uncertainty is an indicator of the potential value of additional measurements. In addition, those parameters that are responsible for variance among individual organisms are assigned stochastic uncertainty. This uncertainty indicates the range of body burdens expected when the end-point species are monitored. The two types of uncertainty are separately accounted for in Monte Carlo simulations of the models. Preliminary monitoring results indicate that the models give reasonably good estimates of heron egg and nestling body burdens and of the variance among individuals.
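Separately accounting for the two kinds of uncertainty is typically done with a two-dimensional (nested) Monte Carlo: an outer loop over knowledge-uncertain parameters and an inner loop over inter-individual variability. The sketch below uses a simplified one-compartment body-burden formula and invented distributions; it is not the paper's model.

```python
import numpy as np

# Two-dimensional Monte Carlo sketch separating knowledge uncertainty
# (outer loop) from inter-individual variability (inner loop). The
# steady-state body-burden formula and all distributions are assumptions.

rng = np.random.default_rng(3)
n_outer, n_inner = 200, 500

medians = np.empty(n_outer)
for k in range(n_outer):
    # Knowledge uncertainty: poorly known site- and chemical-specific values.
    prey_conc = rng.lognormal(np.log(0.5), 0.4)   # mg/kg PCB in prey (assumed)
    elim_rate = rng.lognormal(np.log(0.01), 0.3)  # 1/day elimination (assumed)

    # Variability: individual intake rates within the heron population.
    intake = rng.lognormal(np.log(0.1), 0.25, size=n_inner)  # kg prey/day per kg bw
    burden = prey_conc * intake / elim_rate        # steady-state mg/kg body burden
    medians[k] = np.median(burden)

print(f"population-median body burden, [5th, 50th, 95th] pct across "
      f"knowledge uncertainty: {np.percentile(medians, [5, 50, 95]).round(2)} mg/kg")
```

The spread of the outer-loop percentiles reflects the value of additional measurements, while the inner-loop spread predicts the range to expect when individuals are monitored.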

15.
Risk Analysis, 1996, 16(6): 841-848.
Currently, risk assessments of the potential human health effects associated with exposure to pathogens utilize the conceptual framework that was developed to assess risks associated with chemical exposures. However, the applicability of the chemical framework is problematic due to many issues that are unique to assessing risks associated with pathogens. These include, but are not limited to, the assessment of pathogen/host interactions, consideration of secondary spread, consideration of short- and long-term immunity, and an assessment of conditions that allow the microorganism to propagate. To address this concern, a working group was convened to develop a conceptual framework for assessing the risks of human disease associated with exposure to pathogenic microorganisms. The framework that was developed consists of three phases: problem formulation, analysis (which includes characterization of exposure and human health effects), and risk characterization. The framework emphasizes the dynamic and iterative nature of the risk assessment process, and allows wide latitude for planning and conducting risk assessments in diverse situations, each based on the common principles discussed in the framework.

16.
A novel approach to the quantitative assessment of food-borne risks is proposed. The basic idea is to use Bayesian techniques in two distinct steps: first, constructing a stochastic core model via a Bayesian network based on expert knowledge; and second, using the available data to improve this knowledge. Unlike the Monte Carlo simulation approach commonly used in quantitative assessment of food-borne risks, in which data sets are used independently in each module, this consistent procedure incorporates information conveyed by the data throughout the chain. It allows "back-calculation" in the food chain model, together with the use of data obtained "downstream" in the food chain. Moreover, expert knowledge is introduced more simply and consistently than with classical statistical methods. Other advantages of this approach include the clear framework of an iterative learning process, considerable flexibility enabling the use of heterogeneous data, and a justified method to explore the effects of variability and uncertainty. As an illustration, we present an estimation of the probability of contracting campylobacteriosis as a result of broiler contamination, from the standpoint of quantitative risk assessment. Although the model thus constructed is oversimplified, it clarifies the principles and properties of the proposed method, demonstrates its ability to deal with quite complex situations, and provides a useful basis for further discussion with different experts in the food chain.
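The "back-calculation" principle can be illustrated with a one-node version of the idea: a prior on upstream contamination prevalence is updated using downstream illness counts through a simple risk-per-serving model, here by grid approximation. The dose-response summary and all counts are invented for illustration.

```python
import numpy as np

# Grid-approximation sketch of back-calculation: update a prior on broiler
# contamination prevalence with downstream surveillance data through a
# per-serving risk model. All values are illustrative assumptions.

prev_grid = np.linspace(0.001, 0.999, 999)     # candidate prevalence values
prior = np.ones_like(prev_grid)                # flat prior (assumed)

p_ill_given_contaminated = 0.02                # assumed dose-response summary
servings, illnesses = 100_000, 150             # downstream data (assumed)

# Likelihood: illnesses ~ Binomial(servings, prevalence * p_ill).
p_ill = prev_grid * p_ill_given_contaminated
log_lik = (illnesses * np.log(p_ill) +
           (servings - illnesses) * np.log1p(-p_ill))
post = prior * np.exp(log_lik - log_lik.max())
post /= post.sum()

mean_prev = (prev_grid * post).sum()
print(f"posterior mean contamination prevalence: {mean_prev:.3f}")
```

In the full Bayesian network the same updating propagates through every module of the chain at once, which is what distinguishes the approach from module-by-module Monte Carlo.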

17.
In Part 1 of this article we developed an approach for the calculation of cancer effect measures for life cycle assessment (LCA). In this article, we propose and evaluate a method for the screening of noncancer toxicological health effects. This approach draws on the noncancer health risk assessment concept of the benchmark dose, while noting important differences from regulatory applications in the objectives of an LCA study. We adopt the central-tendency estimate of the toxicological effect dose inducing a 10% response over background, the ED10, to provide a consistent point of departure for default linear low-dose response estimates (betaED10). This explicit estimation of low-dose risks, while necessary in LCA, is in marked contrast to many traditional procedures for noncancer assessments. For pragmatic reasons, mechanistic thresholds and nonlinear low-dose response curves were not implemented in the presented framework. In essence, for the comparative needs of LCA, we propose that one initially screens alternative activities or products on the degree to which the associated chemical emissions erode their margins of exposure, which may or may not be manifested as increases in disease incidence. We illustrate the method here by deriving betaED10 slope factors from bioassay data for 12 chemicals, and outline some of the possibilities for extrapolation from other, more readily available measures, such as no observable adverse effect levels (NOAELs), avoiding uncertainty factors that lead to inconsistent degrees of conservatism from chemical to chemical. These extrapolations facilitated the initial calculation of slope factors for an additional 403 compounds, ranging from 10^-6 to 10^3 (risk per mg/kg-day dose). The potential consequences of the effects are taken into account in a preliminary approach by combining the betaED10 with the severity measure disability-adjusted life years (DALY), providing a screening-level estimate of the potential consequences associated with exposures, integrated over time and space, to a given mass of chemical released into the environment, for use in LCA.
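The arithmetic of a betaED10 slope factor is compact: linear extrapolation through the ED10 gives a slope of 0.1/ED10 risk per unit dose, which can then be weighted by a DALY severity measure. The worked example below uses invented ED10, intake, population, and severity values.

```python
# Worked sketch of a betaED10 slope factor and its DALY weighting.
# All input values are illustrative, not from the article.

ed10 = 5.0                 # mg/kg-day dose producing a 10% response (assumed)
beta_ed10 = 0.1 / ed10     # risk per mg/kg-day under linear extrapolation
print(f"betaED10 slope factor: {beta_ed10:.3e} (risk per mg/kg-day)")

# Screening-level consequence for a population exposure, LCA-style.
lifetime_dose = 1e-4       # mg/kg-day population-average intake (assumed)
persons = 1e6              # exposed population (assumed)
daly_per_case = 0.5        # severity weight for the endpoint (assumed)
expected_dalys = beta_ed10 * lifetime_dose * persons * daly_per_case
print(f"screening estimate: {expected_dalys:.2f} DALYs")
```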

18.
This paper presents an approach for characterizing the probability of adverse effects occurring in a population exposed to dose rates in excess of the Reference Dose (RfD). The approach uses a linear threshold ("hockey stick") model of response and is based on the current system of uncertainty factors used in setting RfDs. The approach requires generally available toxicological estimates such as No-Observed-Adverse-Effect Levels (NOAELs) or benchmark doses, and doses at which adverse effects are observed in 50% of the test animals (ED50s). In this approach, Monte Carlo analysis is used to characterize the uncertainty in the dose-response slope based on the range and magnitude of the key sources of uncertainty in setting protective doses. The method does not require information on the shape of the dose-response curve for specific chemicals, but is amenable to the inclusion of such data. The approach is applied to four compounds to produce estimates of response rates for dose rates greater than the RfD.
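A minimal sketch of a hockey-stick calculation follows: response is zero below an uncertain population threshold and rises linearly toward a 50% response at the ED50, with Monte Carlo sampling of the threshold standing in for the paper's uncertainty-factor analysis. The parameterization and all values are my own simplifying assumptions.

```python
import numpy as np

# Hockey-stick sketch: zero response below an uncertain threshold, linear
# rise to 50% response at the ED50. The threshold distribution is a
# stand-in for the uncertainty-factor analysis; values are invented.

rng = np.random.default_rng(11)
n = 20_000

ed50 = 50.0                                    # mg/kg-day (assumed)
# Uncertain population threshold in the region between RfD and NOAEL (assumed).
threshold = rng.lognormal(np.log(1.0), 0.5, size=n)   # mg/kg-day

dose = 5.0                                     # exposure in excess of the RfD
slope = 0.5 / np.maximum(ed50 - threshold, 1e-9)
response = np.clip(slope * (dose - threshold), 0.0, 1.0)

print(f"P(adverse effect) at {dose} mg/kg-day: "
      f"median {np.median(response):.3f}, "
      f"95th pct {np.percentile(response, 95):.3f}")
```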

19.
The Fukushima Daiichi accident released huge amounts of radioactive material over a wide area. We can appreciate the geographical extent of radioactive contamination from the information published online by the Japanese government. Historically, this is an unprecedented situation, one that allows a "natural experiment" for estimating the causal effects of radioactive contamination on our society. This study focused on property value losses caused by the accident and analyzed changes in land appraisals around the Fukushima Daiichi plant from July 2010 to July 2011 within the framework of the hedonic approach. We thus estimated the short-run impact of the contamination, that is, the change in the marginal value of proximity to the plant. The results suggest that the appraisals depreciated significantly and monotonically with increasing contamination levels. However, there was no evidence of changes in the marginal value of proximity to the plant. A comparison between the appraisals and transaction prices indicates that this result could be interpreted as an underestimate of actual property value losses.
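A hedonic specification of the kind described can be sketched as a log-price regression with a contamination term and a proximity-times-period interaction, the latter testing whether the marginal value of proximity changed after the accident. The sketch below runs on synthetic data constructed to mirror the reported pattern; it is not the study's data or specification.

```python
import numpy as np

# Hedonic-regression sketch on synthetic data: log appraisal regressed on
# contamination and on distance interacted with a post-accident indicator.
# Variable construction and coefficients are illustrative assumptions.

rng = np.random.default_rng(5)
n = 400

dist = rng.uniform(5, 80, n)            # km to the plant
post = rng.integers(0, 2, n)            # 0 = July 2010, 1 = July 2011
contamination = post * np.exp(-dist / 20) * rng.uniform(0.5, 1.5, n)

# Synthetic appraisals: depreciation via contamination, none via proximity.
log_price = 10 - 0.30 * contamination + 0.002 * dist + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), contamination, dist, post, dist * post])
coef, *_ = np.linalg.lstsq(X, log_price, rcond=None)
for name, c in zip(["intercept", "contamination", "distance",
                    "post", "distance x post"], coef):
    print(f"{name:>16}: {c:+.4f}")
# A near-zero 'distance x post' coefficient mirrors the finding of no
# change in the marginal value of proximity to the plant.
```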

20.
Reliability and higher levels of safety are thought to be achieved by using systematic approaches to managing risks. Risk assessment has produced a range of different approaches to characterizing these uncertainties, presenting models of how risks affect individuals or organizations. Contemporary risk assessment tools based on this approach have proven difficult for practitioners to use for tactical and operational decision making. This article presents an alternative to these assessments by adopting a resilience perspective, arguing that complex systems are inclined to variety and uncertainty in the results they produce and are therefore prone to systemic failures. A continuous improvement approach is a source of reliability when managing complex systems and is necessary to manage these varieties and uncertainties. For an organization to understand how risk events occur, it is necessary to define what is believed to be the equilibrium of the system in time and space. By applying a resilience engineering (RE) perspective to risk assessment, it is possible to manage this complexity by assessing the ability to respond, monitor, learn, and anticipate risks, and in so doing to move away from the flawed frequency-and-consequences approach. Using a research station network in the Arctic as an example illustrates how an RE approach qualifies assessments by bridging risk assessments with value-creation processes. The article concludes by arguing that a resilience-based risk assessment can improve on current practice, including for organizations located outside the Arctic region.
