Similar Articles (17 results)
1.
Analysis of oversight systems is often conducted from a single disciplinary perspective and by using a limited set of criteria for evaluation. In this article, we develop an approach that blends risk analysis, social science, public administration, legal, public policy, and ethical perspectives to develop a broad set of criteria for assessing oversight systems. Multiple methods, including historical analysis, expert elicitation, and behavioral consensus, were employed to develop multidisciplinary criteria for evaluating oversight of emerging technologies. Sixty‐six initial criteria were identified from extensive literature reviews and input from our Working Group. Criteria were placed in four categories reflecting the development, attributes, evolution, and outcomes of oversight systems. Expert elicitation, consensus methods, and multidisciplinary review of the literature were used to refine a condensed, operative set of criteria. Twenty‐eight criteria resulted, spanning four categories: seven development criteria, 15 attribute criteria, five outcome criteria, and one evolution criterion. These criteria illuminate how oversight systems develop, operate, change, and affect society. We term our approach “integrated oversight assessment” and propose its use as a tool for analyzing relationships among features, outcomes, and tradeoffs of oversight systems. Comparisons among historical case studies of oversight using a consistent set of criteria should result in defensible and evidence‐supported lessons to guide the development of oversight systems for emerging technologies, such as nanotechnology.

2.
Contaminated sediments and other sites present a difficult challenge for environmental decisionmakers. They are typically slow to recover or attenuate naturally, may involve multiple regulatory agencies and stakeholder groups, and engender multiple toxicological and ecotoxicological risks. While environmental decision-making strategies over the last several decades have evolved into increasingly more sophisticated, information-intensive, and complex approaches, there remains considerable dissatisfaction among business, industry, and the public with existing management strategies. Consequently, contaminated sediments and materials are the subject of intense technology development, such as beneficial reuse or in situ treatment. However, current decision analysis approaches, such as comparative risk assessment, benefit-cost analysis, and life cycle assessment, do not offer a comprehensive approach for incorporating the varied types of information and multiple stakeholder and public views that must typically be brought to bear when new technologies are under consideration. Alternatively, multicriteria decision analysis (MCDA) offers a scientifically sound decision framework for management of contaminated materials or sites where stakeholder participation is of crucial concern and criteria such as economics, environmental impacts, safety, and risk cannot be easily condensed into simple monetary expressions. This article brings together a multidisciplinary review of existing decision-making approaches at regulatory agencies in the United States and Europe and synthesizes state-of-the-art research in MCDA methods applicable to the assessment of contaminated sediment management technologies. Additionally, it tests an MCDA approach for coupling expert judgment and stakeholder values in a hypothetical contaminated sediments management case study wherein MCDA is used as a tool for testing stakeholder responses to and improving expert assessment of innovative contaminated sediments technologies.
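As a concrete illustration of the weighted aggregation at the heart of such MCDA frameworks, the sketch below ranks hypothetical sediment-management alternatives with a normalized weighted sum. The alternatives, criteria, and weights are invented for illustration and are not drawn from the case study.

```python
# Minimal weighted-sum MCDA sketch: raw scores are normalized per criterion,
# then combined with stakeholder-elicited weights. All alternatives,
# criteria, and weights here are hypothetical.

def normalize(values, benefit=True):
    """Rescale raw criterion values to [0, 1]; invert for cost criteria."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return scaled if benefit else [1.0 - s for s in scaled]

def mcda_rank(alternatives, criteria, weights):
    """alternatives: {name: {criterion: raw value}};
    criteria: {criterion: True if higher-is-better}."""
    names = list(alternatives)
    scores = {n: 0.0 for n in names}
    for crit, is_benefit in criteria.items():
        col = normalize([alternatives[n][crit] for n in names], is_benefit)
        for n, s in zip(names, col):
            scores[n] += weights[crit] * s
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Hypothetical options scored on cost (lower is better), ecological risk
# reduction, and stakeholder acceptance (higher is better).
alts = {
    "dredge+landfill": {"cost": 9.0, "risk_reduction": 8.0, "acceptance": 4.0},
    "capping":         {"cost": 4.0, "risk_reduction": 6.0, "acceptance": 6.0},
    "in_situ_treat":   {"cost": 6.0, "risk_reduction": 7.0, "acceptance": 8.0},
}
crits = {"cost": False, "risk_reduction": True, "acceptance": True}
wts = {"cost": 0.4, "risk_reduction": 0.4, "acceptance": 0.2}
ranking = mcda_rank(alts, crits, wts)
```

Swapping in a different weight profile (for example, one elicited from a regulator versus a community group) and re-running the ranking is exactly the kind of stakeholder-response test the abstract describes.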

3.
The volume and variety of manufactured chemicals is increasing, although little is known about the risks associated with the frequency and extent of human exposure to most chemicals. Both the EPA and the recently signed Lautenberg Act have signaled the need for high-throughput methods to characterize and screen chemicals based on exposure potential, such that more comprehensive toxicity research can be informed. Prior work of Mitchell et al. using multicriteria decision analysis tools to prioritize chemicals for further research is enhanced here, resulting in a high-level chemical prioritization tool for risk-based screening. Reliable exposure information is a key gap in currently available engineering analytics to support predictive environmental and health risk assessments. An elicitation with 32 experts informed relative prioritization of risks from chemical properties and human use factors, and the values for each chemical associated with each metric were approximated with data from EPA's CP_CAT database. Three different versions of the model were evaluated using distinct weight profiles, resulting in three different ranked chemical prioritizations with only a small degree of variation across weight profiles. Future work will aim to include greater input from human factors experts and better define qualitative metrics.

4.
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life‐cycle assessments and cost‐benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil‐fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high‐level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions.
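The Monte Carlo layer of such a model can be sketched minimally: compare two hypothetical CO2 storage sites by sampling uncertain costs and averaging a utility function. The triangular cost distributions, utility bounds, and site parameters below are assumptions for illustration, not data from the article.

```python
import random

# Monte Carlo expected-utility sketch for two hypothetical CO2 storage
# sites. Costs follow illustrative triangular distributions; the utility
# is linear (risk-neutral), which is an assumption, not the article's model.

def utility(cost, worst=100.0, best=0.0):
    """Linear utility: 1.0 at the best cost, 0.0 at the worst."""
    return (worst - cost) / (worst - best)

def expected_utility(cost_dist, n=20000, seed=42):
    """Average utility over n samples from a (low, mode, high) triangular."""
    rng = random.Random(seed)
    lo, mode, hi = cost_dist
    total = 0.0
    for _ in range(n):
        total += utility(rng.triangular(lo, hi, mode))
    return total / n

site_a = (30.0, 45.0, 80.0)   # wide uncertainty, lower mode
site_b = (40.0, 50.0, 60.0)   # narrow uncertainty, higher mode
eu_a = expected_utility(site_a)
eu_b = expected_utility(site_b)
```

Replacing the linear utility with a concave one, or perturbing the distribution bounds, gives the sensitivity analysis the abstract mentions: the ranking of sites can flip as risk attitude or uncertainty assumptions change.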

5.
This article presents a regression‐tree‐based meta‐analysis of rodent pulmonary toxicity studies of uncoated, nonfunctionalized carbon nanotube (CNT) exposure. The resulting analysis provides quantitative estimates of the contribution of CNT attributes (impurities, physical dimensions, and aggregation) to pulmonary toxicity indicators in bronchoalveolar lavage fluid: neutrophil and macrophage count, and lactate dehydrogenase and total protein concentrations. The method employs classification and regression tree (CART) models, techniques that are relatively insensitive to data defects that impair other types of regression analysis: high dimensionality, nonlinearity, correlated variables, and significant quantities of missing values. Three types of analysis are presented: a regression tree (RT), a random forest (RF), and a random‐forest‐based dose‐response model. The RT shows the best single model supported by all the data and typically contains a small number of variables. The RF shows how much variance reduction is associated with every variable in the data set. The dose‐response model is used to isolate the effects of CNT attributes from the CNT dose, showing the shift in the dose‐response caused by the attribute across the measured range of CNT doses. It was found that the CNT attributes that contribute the most to pulmonary toxicity were metallic impurities (cobalt significantly increased observed toxicity, while other impurities had mixed effects), CNT length (negatively correlated with most toxicity indicators), CNT diameter (significantly positively associated with toxicity), and aggregate size (negatively correlated with cell damage indicators and positively correlated with immune response indicators). Increasing CNT N2‐BET‐specific surface area decreased toxicity indicators.
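The core operation of a regression tree, an exhaustive search for the split threshold that most reduces squared error in the response, can be sketched in a few lines. The dose and neutrophil-count numbers below are invented for illustration and are not from the meta-analysis.

```python
# CART-style split search sketch: find the threshold on one predictor that
# minimizes total sum-of-squared-error (SSE) of the two resulting groups.
# Data are hypothetical (dose vs. a toxicity indicator), not study data.

def sse(ys):
    """Sum of squared deviations from the mean."""
    if not ys:
        return 0.0
    mean = sum(ys) / len(ys)
    return sum((y - mean) ** 2 for y in ys)

def best_split(xs, ys):
    """Return (threshold, total_sse) for the best binary split of xs."""
    pairs = sorted(zip(xs, ys))
    best = (None, sse(ys))  # fall back to no split
    for i in range(1, len(pairs)):
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        total = sse(left) + sse(right)
        if total < best[1]:
            thresh = (pairs[i - 1][0] + pairs[i][0]) / 2
            best = (thresh, total)
    return best

dose = [0.1, 0.2, 0.5, 1.0, 2.0, 4.0]
neutrophils = [1.0, 1.1, 1.2, 3.9, 4.2, 4.0]
threshold, err = best_split(dose, neutrophils)
```

A full CART implementation applies this search recursively over all predictors, which is why the method tolerates nonlinearity and correlated variables; a random forest then averages many such trees over bootstrap samples.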

6.
Alec Morton 《Risk analysis》2011,31(1):129-142
In this article, we compare two high‐profile strategic policy reviews undertaken for the U.K. government on environmental risks: radioactive waste management and climate change. These reviews took very different forms, both in terms of analytic approach and deliberation strategy. The Stern Review on the Economics of Climate Change was largely an exercise in expert modeling, building, within a cost‐benefit framework, an argument for immediate reductions in carbon emissions. The Committee on Radioactive Waste Management, on the other hand, followed a much more explicitly deliberative and participative process, using multicriteria decision analysis to bring together scientific evidence and stakeholder and public values. In this article, we ask why the two reviews were different, and whether the differences are justified. We conclude that the differences were mainly due to political context, rather than the underpinning science, and as a consequence that, while in our view “fit for purpose,” they would both have been stronger had they been less different. Stern's grappling with ethical issues could have been strengthened by a greater degree of public and stakeholder engagement, and the Committee on Radioactive Waste Management's handling of issues of uncertainty could have been strengthened by the explicitly probabilistic framework of Stern.

7.
This paper introduces stochastic dominance as a technique to reduce the set of possible actions that a decision maker must consider in a decision problem under risk. The procedure usually does not choose an optimal action, but instead eliminates certain actions as unacceptable. Very little need be known about the decision maker's utility function. Two possible applications are presented: upgrading buildings to better withstand an earthquake; and choosing a site for an LNG facility.
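A minimal sketch of first-order stochastic dominance screening over discrete payoff distributions follows; the retrofit options and their probabilities are hypothetical. Action A dominates B when A's cumulative distribution lies everywhere at or below B's, so every expected-utility maximizer with an increasing utility function weakly prefers A, which is why so little need be known about the utility function.

```python
# First-order stochastic dominance (FSD) sketch over discrete payoff
# distributions. Higher outcomes are better. Distributions are hypothetical.

def cdf(dist, outcomes):
    """Cumulative probabilities of dist at each sorted outcome.
    dist maps outcome -> probability."""
    acc, out = 0.0, []
    for x in outcomes:
        acc += dist.get(x, 0.0)
        out.append(acc)
    return out

def fsd_dominates(a, b):
    """True if a first-order stochastically dominates b."""
    support = sorted(set(a) | set(b))
    return all(ca <= cb + 1e-12
               for ca, cb in zip(cdf(a, support), cdf(b, support)))

# Hypothetical building-retrofit actions: payoff -> probability.
upgrade = {0: 0.1, 50: 0.3, 100: 0.6}
do_nothing = {0: 0.4, 50: 0.3, 100: 0.3}
dominates = fsd_dominates(upgrade, do_nothing)
```

In a screening pass, any action dominated by another is eliminated as unacceptable, shrinking the choice set without committing to a specific utility function.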

8.
An Approach to Vulnerability Analysis of Complex Industrial Systems
Stefán Einarsson, Marvin Rausand 《Risk analysis》1998,18(5):535-546
The concept of vulnerability of complex industrial systems is defined and discussed in relation to risk and system survivability. The discussion is illustrated by referring to a number of previous industrial accidents. The various risk factors, or threats, influencing an industrial system's vulnerability are classified and discussed. Both internal and external threats are covered. The general scope of vulnerability analysis is compared to traditional risk analysis approaches and main differences are illustrated. A general procedure for vulnerability analysis in two steps, including building of scenarios and preparation of relevant worksheets, is described and discussed.

9.
J. W. Owens 《Risk analysis》1997,17(3):359-365
A life-cycle approach takes a cradle-to-grave perspective of a product's numerous activities from the raw material extraction to final disposal. There have been recent efforts to develop life-cycle assessment (LCA) to assess both environmental and human health issues. The question then arises: what are the capabilities of LCA, especially in relation to risk assessment? To address this question, this paper first describes the LCA mass-based accounting system and then analyzes the use of this approach for environmental and human health assessment. The key LCA limitations in this respect are loss of spatial, temporal, dose-response, and threshold information. These limitations affect LCA's capability to assess several environmental issues, and human health in particular. This leads to the conclusion that LCA impact assessment does not predict or measure actual effects, quantify risks, or address safety. Instead, LCA uses mass loadings with simplifying assumptions and subjective judgments to add independent effects and exposures into an overall score. As a result, LCA identifies possible human health issues on a systemwide basis from a worst case, hypothetical hazard perspective. Ideally, the identified issues would then be addressed by more detailed assessment methods, such as risk assessment.

10.
Access management, which systematically limits opportunities for egress and ingress of vehicles to highway lanes, is critical to protect trillions of dollars of current investment in transportation. This article addresses allocating resources for access management with incomplete and partially relevant data on crash rates, travel speeds, and other factors. While access management can be effective to avoid crashes, reduce travel times, and increase route capacities, the literature suggests a need for performance metrics to guide investments in resource allocation across large corridor networks and several time horizons. In this article, we describe a quantitative decision model to support an access management program via risk‐cost‐benefit analysis under data uncertainties from diverse sources of data and expertise. The approach quantifies potential benefits, including safety improvement and travel time savings, and costs of access management through functional relationships of input parameters including crash rates, corridor access point densities, and traffic volumes. Parameter uncertainties, which vary across locales and experts, are addressed via numerical interval analyses. This approach is demonstrated at several geographic scales across 7,000 kilometers of highways in a geographic region and several subregions. The demonstration prioritizes route segments that would benefit from risk management, including (i) additional data or elicitation, (ii) right‐of‐way purchases, (iii) restriction or closing of access points, (iv) new alignments, and (v) developer proffers. The approach ought to be of wide interest to analysts, planners, policymakers, and stakeholders who rely on heterogeneous data and expertise for risk management.
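Numerical interval analysis of the kind described can be sketched by propagating [low, high] bounds through a net-benefit calculation for one route segment. All parameter ranges below are invented for illustration and are not the article's data.

```python
# Interval-arithmetic sketch: carry [low, high] bounds for uncertain
# parameters through benefit - cost for a hypothetical route segment.
# Every number here is an illustrative assumption.

def i_add(a, b):
    """Interval addition."""
    return (a[0] + b[0], a[1] + b[1])

def i_sub(a, b):
    """Interval subtraction: worst case pairs a's low with b's high."""
    return (a[0] - b[1], a[1] - b[0])

# Hypothetical segment parameters (annual figures).
crashes_avoided = (2.0, 5.0)       # crashes avoided per year
value_per_crash = (0.1, 0.3)       # $M per crash avoided
travel_time_savings = (0.2, 0.6)   # $M per year
closure_cost = (0.5, 0.9)          # $M per year for access-point closures

# Product of nonnegative intervals: multiply endpoints directly.
crash_benefit = (crashes_avoided[0] * value_per_crash[0],
                 crashes_avoided[1] * value_per_crash[1])
benefit = i_add(crash_benefit, travel_time_savings)
net = i_sub(benefit, closure_cost)
```

A net-benefit interval that straddles zero, as here, flags a segment where collecting more data or eliciting tighter expert bounds is itself a candidate risk-management action, matching prioritization item (i) above.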

11.
In emergent photovoltaics, nanoscale materials hold promise for optimizing device characteristics; however, the related impacts remain uncertain, resulting in challenges to decisions on strategic investment in technology innovation. We integrate multi‐criteria decision analysis (MCDA) and life‐cycle assessment (LCA) results (LCA‐MCDA) as a method of incorporating values of a hypothetical federal acquisition manager into the assessment of risks and benefits of emerging photovoltaic materials. Specifically, we compare adoption of copper zinc tin sulfide (CZTS) devices with molybdenum back contacts to alternative devices employing graphite or graphene instead of molybdenum. LCA impact results are interpreted alongside benefits of substitution including cost reductions and performance improvements through application of multi‐attribute utility theory. To assess the role of uncertainty we apply Monte Carlo simulation and sensitivity analysis. We find that graphene or graphite back contacts outperform molybdenum under most scenarios and assumptions. The use of decision analysis clarifies potential advantages of adopting graphite as a back contact while emphasizing the importance of mitigating conventional impacts of graphene production processes if graphene is used in emerging CZTS devices. Our research further demonstrates that a combination of LCA and MCDA increases the usability of LCA in assessing product sustainability. In particular, this approach identifies the most influential assumptions and data gaps in the analysis and the areas in which either engineering controls or further data collection may be necessary.

12.
Cost‐benefit analysis (CBA) is commonly applied as a tool for deciding on risk protection. With CBA, one can identify risk mitigation strategies that lead to an optimal tradeoff between the costs of the mitigation measures and the achieved risk reduction. In practical applications of CBA, the strategies are typically evaluated through efficiency indicators such as the benefit‐cost ratio (BCR) and the marginal cost (MC) criterion. In many of these applications, the BCR is not consistently defined, which, as we demonstrate in this article, can lead to the identification of suboptimal solutions. This is of particular relevance when the overall budget for risk reduction measures is limited and an optimal allocation of resources among different subsystems is necessary. We show that this problem can be formulated as a hierarchical decision problem, where the general rules and decisions on the available budget are made at a central level (e.g., central government agency, top management), whereas the decisions on the specific measures are made at the subsystem level (e.g., local communities, company division). It is shown that the MC criterion provides optimal solutions in such hierarchical optimization. Since most practical applications only include a discrete set of possible risk protection measures, the MC criterion is extended to this situation. The findings are illustrated through a hypothetical numerical example. This study was prepared as part of our work on the optimal management of natural hazard risks, but its conclusions also apply to other fields of risk management.
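With a discrete set of measures and a fixed central budget, one simple reading of the MC criterion is to fund measures in order of increasing cost per unit of risk reduction until the budget runs out. The sketch below implements that greedy reading; the subsystem measures, costs, and risk reductions are hypothetical, not the article's numerical example.

```python
# Greedy sketch of a marginal-cost (MC) ordering for discrete measures:
# fund the cheapest risk reduction first until the central budget is spent.
# All measures and figures are hypothetical illustrations.

def mc_allocate(measures, budget):
    """measures: list of (name, cost, risk_reduction).
    Returns (funded names in funding order, total spent)."""
    ranked = sorted(measures, key=lambda m: m[1] / m[2])  # cost per unit risk
    funded, spent = [], 0.0
    for name, cost, _ in ranked:
        if spent + cost <= budget:
            funded.append(name)
            spent += cost
    return funded, spent

measures = [
    ("levee_A", 4.0, 10.0),   # MC = 0.40 per unit of risk reduced
    ("drain_B", 2.0, 8.0),    # MC = 0.25
    ("alarm_C", 1.0, 2.0),    # MC = 0.50
    ("wall_D",  5.0, 6.0),    # MC = 0.83
]
funded, spent = mc_allocate(measures, budget=7.0)
```

Note that a pure greedy pass can leave budget on the table when costs are lumpy, which is one reason the article's extension of the MC criterion to discrete measure sets matters.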

13.
On the basis of the combination of the well‐known knapsack problem and a widely used risk management technique in organizations (that is, the risk matrix), an approach was developed to carry out a cost‐benefit analysis to support efficient prevention investment decisions. Using the knapsack problem as a model and combining it with a well‐known technique to solve this problem, bundles of prevention measures are prioritized based on their costs and benefits within a predefined prevention budget. Those bundles showing the highest efficiencies, and within a given budget, are identified from a wide variety of possible alternatives. Hence, the approach allows for an optimal allocation of safety resources, does not require any highly specialized information, and can therefore easily be applied by any organization using the risk matrix as a risk ranking tool.
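The underlying 0/1 knapsack formulation can be sketched with a small dynamic program that finds the best-value bundle within the budget. The measures, integer budget units, and benefit scores below are hypothetical, and the benefit values stand in for whatever the risk matrix assigns.

```python
# 0/1 knapsack sketch: choose the bundle of prevention measures that
# maximizes total benefit within a budget. Costs are in integer budget
# units; all measures and numbers are hypothetical.

def knapsack(measures, budget):
    """measures: list of (name, integer_cost, benefit).
    Returns (best total benefit, tuple of chosen names)."""
    best = {0: (0.0, ())}  # spent -> (best benefit at that spend, bundle)
    for name, cost, benefit in measures:
        # Snapshot and walk spends in descending order so each measure
        # is used at most once.
        for spent, (val, chosen) in sorted(best.items(), reverse=True):
            new_spent = spent + cost
            if new_spent <= budget:
                cand = (val + benefit, chosen + (name,))
                if cand[0] > best.get(new_spent, (-1.0, ()))[0]:
                    best[new_spent] = cand
    return max(best.values())

measures = [("guardrail", 3, 7.0), ("training", 2, 4.0),
            ("sensor", 4, 9.0), ("signage", 1, 1.5)]
value, chosen = knapsack(measures, budget=6)
```

With a budget of 6 units the optimal bundle is training plus sensor, beating the greedy choice of starting with the single highest-benefit measure and then being unable to afford the sensor.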

14.
The U.S. electric power system is increasingly vulnerable to the adverse impacts of extreme climate events. Supply inadequacy risk can result from climate‐induced shifts in electricity demand and/or damaged physical assets due to hydro‐meteorological hazards and climate change. In this article, we focus on the risks associated with unanticipated climate‐induced demand shifts and propose a data‐driven approach to identify risk factors that render the electricity sector vulnerable in the face of future climate variability and change. More specifically, we have leveraged advanced supervised learning theory to identify the key predictors of climate‐sensitive demand in the residential, commercial, and industrial sectors. Our analysis indicates that variation in mean dew point temperature is the common major risk factor across all three sectors. We have also conducted a statistical sensitivity analysis to assess the variability in the projected demand as a function of the key climate risk factor. We then propose the use of scenario‐based heat maps as a tool to communicate the inadequacy risks to stakeholders and decisionmakers. While we use the state of Ohio as a case study, our proposed approach is equally applicable to all other states.

15.
Land subsidence risk assessment (LSRA) is a multi‐attribute decision analysis (MADA) problem and is often characterized by both quantitative and qualitative attributes with various types of uncertainty. Therefore, the problem needs to be modeled and analyzed using methods that can handle uncertainty. In this article, we propose an integrated assessment model based on the evidential reasoning (ER) algorithm and fuzzy set theory. The assessment model is structured as a hierarchical framework that regards land subsidence risk as a composite of two key factors: hazard and vulnerability. These factors can be described by a set of basic indicators defined by assessment grades with attributes for transforming both numerical data and subjective judgments into a belief structure. The factor‐level attributes of hazard and vulnerability are combined using the ER algorithm, which is based on the information from a belief structure calculated by the Dempster‐Shafer (D‐S) theory, and a distributed fuzzy belief structure calculated by fuzzy set theory. The results from the combined algorithms yield distributed assessment grade matrices. The application of the model to the Xixi‐Chengnan area, China, illustrates its usefulness and validity for LSRA. The model utilizes a combination of all types of evidence, including all assessment information—quantitative or qualitative, complete or incomplete, and precise or imprecise—to provide assessment grades that define risk assessment on the basis of hazard and vulnerability. The results will enable risk managers to apply different risk prevention measures and mitigation planning based on the calculated risk states.
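The evidence-combination step rests on Dempster's rule from D-S theory. A minimal sketch follows, combining hypothetical hazard and vulnerability mass functions over two risk grades; the grades, masses, and two-factor framing are illustrative simplifications, not the article's full ER algorithm.

```python
# Dempster's rule of combination sketch over belief masses whose focal
# elements are frozensets of risk grades. Masses are hypothetical.

def dempster_combine(m1, m2):
    """Combine two mass functions (frozenset -> mass) via Dempster's rule."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2  # contradictory evidence
    norm = 1.0 - conflict            # renormalize over non-conflicting mass
    return {k: v / norm for k, v in combined.items()}

L, H = frozenset("L"), frozenset("H")
LH = L | H  # partial ignorance: "low or high"
hazard = {L: 0.6, H: 0.3, LH: 0.1}
vulnerability = {L: 0.5, H: 0.4, LH: 0.1}
fused = dempster_combine(hazard, vulnerability)
```

The mass left on the ignorance set after combination is what distinguishes a belief-structure result from an ordinary probability: it records how much the evidence is incomplete rather than forcing a grade.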

16.
《Risk analysis》2018,38(4):826-838
Phishing risk is a growing area of concern for corporations, governments, and individuals. Given the evidence that users vary widely in their vulnerability to phishing attacks, we demonstrate an approach for assessing the benefits and costs of interventions that target the most vulnerable users. Our approach uses Monte Carlo simulation to (1) identify which users were most vulnerable, in signal detection theory terms; (2) assess the proportion of system‐level risk attributable to the most vulnerable users; (3) estimate the monetary benefit and cost of behavioral interventions targeting different vulnerability levels; and (4) evaluate the sensitivity of these results to whether the attacks involve random or spear phishing. Using parameter estimates from previous research, we find that the most vulnerable users were less cautious and less able to distinguish between phishing and legitimate emails (positive response bias and low sensitivity, in signal detection theory terms). They also accounted for a large share of phishing risk for both random and spear phishing attacks. Under these conditions, our analysis estimates much greater net benefit for behavioral interventions that target these vulnerable users. Within the range of the model's assumptions, there was generally net benefit even for the least vulnerable users. However, the differences in the return on investment for interventions with users with different degrees of vulnerability indicate the importance of measuring that performance, and letting it guide interventions. This study suggests that interventions to reduce response bias, rather than to increase sensitivity, have greater net benefit.
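The signal detection quantities involved, sensitivity d′ and response bias c, can be computed directly from hit and false-alarm rates. The user profiles below are hypothetical illustrations, not the study's parameter estimates, and the sign convention for c is one common choice.

```python
from statistics import NormalDist

# Signal detection sketch for phishing detection: a "hit" is correctly
# flagging a phishing email, a "false alarm" is flagging a legitimate one.
# Rates are hypothetical user profiles, not the article's estimates.

def sdt_indices(hit_rate, fa_rate):
    """Return (d_prime, c) from hit and false-alarm rates.
    d_prime: ability to discriminate phish from legitimate mail.
    c (one common convention): > 0 means a bias toward judging
    email legitimate, which is the risky direction here."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, c

# A vulnerable user: barely discriminates and leans toward trusting email.
d_vuln, c_vuln = sdt_indices(hit_rate=0.55, fa_rate=0.35)
# A resistant user: detects most phish with few false alarms.
d_safe, c_safe = sdt_indices(hit_rate=0.90, fa_rate=0.10)
```

In this framing, an intervention that reduces response bias shifts c toward flagging suspicious mail without requiring any change in d′, which is the kind of intervention the abstract finds most cost-effective.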

17.
In this article, we examine the effects of shortcuts in the development of engineered systems through a principal-agent model. We find that occurrences of illicit shortcuts are closely related to the incentive structure and to the level of effort that the agent is willing to expend from the beginning of the project to remain on schedule. Using a probabilistic risk analysis to determine the risks of system failure from these shortcuts, we show how a principal can choose optimal settings (payments, penalties, and inspections) that can deter an agent from cutting corners and maximize the principal's value through increased agent effort. We analyze the problem for an agent with limited liability. We consider first the case where he is risk neutral; we then include the case where he is risk averse.

