Similar literature (20 results)
1.
In this note I reply to the comments by Haimes et al. on my paper on the sensitivity analysis of the inoperability input-output model. I make the case for moment-independent sensitivity analysis.

2.
Outbreaks of contagious diseases underscore the ever-looming threat of new epidemics. Compared to other disasters that inflict physical damage to infrastructure systems, epidemics can have more devastating and prolonged impacts on the population. This article investigates the interdependent economic and productivity risks resulting from epidemic-induced workforce absenteeism. In particular, we develop a dynamic input-output model capable of generating sector-disaggregated economic losses based on different magnitudes of workforce disruptions. An ex post analysis of the 2009 H1N1 pandemic in the national capital region (NCR) reveals the distribution of consequences across different economic sectors. Consequences are categorized into two metrics: (i) economic loss, which measures the magnitude of monetary losses incurred in each sector, and (ii) inoperability, which measures the normalized monetary losses incurred in each sector relative to the total economic output of that sector. For a simulated mild pandemic scenario in the NCR, two distinct rankings are generated using the economic loss and inoperability metrics. Results indicate that the majority of the critical sectors ranked according to the economic loss metric comprise sectors that contribute the most to the NCR's gross domestic product (e.g., federal government enterprises). In contrast, the majority of the critical sectors generated by the inoperability metric include sectors that are involved with epidemic management (e.g., hospitals). Hence, prioritizing sectors for recovery necessitates consideration of the balance between economic loss, inoperability, and other objectives. Although applied specifically to the NCR, the proposed methodology can be customized for other regions.
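
The dynamic input-output model described above is commonly written as the recursion q(t+1) = q(t) + K[A*q(t) + c*(t) - q(t)], where q is the vector of sector inoperabilities, A* the interdependency matrix, K a diagonal resilience (recovery-rate) matrix, and c* a demand-side perturbation. Below is a minimal sketch under that formulation; the three-sector matrices are invented for illustration and are not the NCR data.

```python
import numpy as np

def diim_trajectory(A_star, K, c_star, q0, steps):
    """Simulate the discrete-time dynamic inoperability input-output
    model: q[t+1] = q[t] + K (A* q[t] + c*[t] - q[t])."""
    q = np.array(q0, dtype=float)
    path = [q.copy()]
    for _ in range(steps):
        q = q + K @ (A_star @ q + c_star - q)
        path.append(q.copy())
    return np.array(path)

# Illustrative (not NCR) 3-sector data: interdependency matrix A*,
# diagonal resilience matrix K, and a demand perturbation c*.
A_star = np.array([[0.0, 0.2, 0.1],
                   [0.1, 0.0, 0.3],
                   [0.2, 0.1, 0.0]])
K = np.diag([0.5, 0.4, 0.6])   # sector recovery rates
c_star = np.zeros(3)           # no sustained demand shock
q0 = [0.30, 0.05, 0.0]         # initial inoperability (workforce loss)

path = diim_trajectory(A_star, K, c_star, q0, steps=60)
```

Multiplying each column of the trajectory by the sector's as-planned output and summing over time would give the economic-loss metric that complements inoperability in the ranking comparison.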

3.
Input-output analysis is frequently used in studies of large-scale weather-related (e.g., hurricanes and flooding) disruption of a regional economy. The economy after a sudden catastrophe shows a multitude of imbalances with respect to demand and production and may take months or years to recover. However, there is no consensus about how the economy recovers. This article presents a theoretical route map for imbalanced economic recovery called dynamic inequalities. Subsequently, it is applied to a hypothetical postdisaster economic scenario of flooding in London around the year 2020 to assess the influence of future shocks to a regional economy and to suggest adaptation measures. Economic projections are produced by a macroeconometric model and used as baseline conditions. The results suggest that London's economy would recover over approximately 70 months by applying a proportional rationing scheme under the assumption of an initial 50% labor loss (with full recovery in six months), a 40% initial loss to service sectors, and a 10-30% initial loss to other sectors. The results also suggest that imbalance will be the norm during the postdisaster period of economic recovery even though balance may occur temporarily. Model sensitivity analysis suggests that a proportional rationing scheme may be an effective strategy to apply during postdisaster economic reconstruction, and that policies in transportation recovery and in health care are essential for effective postdisaster economic recovery.

4.
An influenza pandemic is a serious disaster that can cause significant disruptions to the workforce and associated economic sectors. This article examines the impact of an influenza pandemic on workforce availability within an interdependent set of economic sectors. We introduce a simulation model based on the dynamic input-output model to capture the propagation of pandemic consequences through the National Capital Region (NCR). The analysis conducted in this article is based on the 2009 H1N1 pandemic data. Two metrics were used to assess the impacts of the influenza pandemic on the economic sectors: (i) inoperability, which measures the percentage gap between the as-planned output and the actual output of a sector, and (ii) economic loss, which quantifies the associated monetary value of the degraded output. The inoperability and economic loss metrics generate two different rankings of the critical economic sectors. Results show that most of the critical sectors in terms of inoperability are sectors related to hospitals and health-care providers. On the other hand, most of the sectors that are critically ranked in terms of economic loss are sectors with significant total production outputs in the NCR, such as federal government agencies. Therefore, policy recommendations relating to potential mitigation and recovery strategies should take into account the balance between the inoperability and economic loss metrics.

5.
In this work, we introduce a generalized rationale for local sensitivity analysis (SA) methods that allows one to solve the problems connected with input constraints. Several models in use in the risk analysis field are characterized by the presence of deterministic relationships among the input parameters. However, SA issues related to the presence of constraints have mainly been dealt with in a heuristic fashion. We start with a systematic analysis of the effects of constraints. The findings can be summarized in the following three effects: (i) constraints make it impossible to vary one parameter while keeping all others fixed; (ii) the model output becomes insensitive to a parameter if a constraint is solved for that parameter; and (iii) sensitivity analysis results depend on which parameter is selected as dependent. The explanation of these effects is found by proposing a result that leads to a natural extension of the local SA rationale introduced in Helton (1993). We then extend the definitions of the Birnbaum, criticality, and differential importance measures to the constrained case. In addition, a procedure is introduced that allows one to obtain constrained sensitivity results at the same cost as in the absence of constraints. The application to a nonbinary event tree concludes the article, providing a numerical illustration of the above findings.

6.
Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have the potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental to international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios, and guide investments that are effective and feasible to implement. Priorities for protective measures and continuity-of-operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and also integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods are demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually. The methods will be useful to port management, private industry supply chain planning, and transportation infrastructure management.

7.
Interdependency analysis in the context of this article is a process of assessing and managing risks inherent in a system of interconnected entities (e.g., infrastructures or industry sectors). Invoking the principles of input-output (I-O) and decomposition analysis, the article offers a framework for describing how terrorism-induced perturbations can propagate due to interconnectedness. Data published by the Bureau of Economic Analysis of the U.S. Department of Commerce are utilized to present applications that serve as test beds for the proposed framework. Specifically, a case study estimating the economic impact of airline demand perturbations on national-level U.S. sectors is made possible using I-O matrices. A ranking of the affected sectors according to their vulnerability to perturbations originating from a primary sector (e.g., air transportation) can serve as important input to risk management. For example, limited resources can be prioritized for the "top-n" sectors that are perceived to suffer the greatest economic losses due to terrorism. In addition, regional decomposition via location quotients enables the analysis of local-level terrorism events. The Regional I-O Multiplier System II (RIMS II) division of the U.S. Department of Commerce is responsible for releasing the regional multipliers for various geographical resolutions (economic areas, states, and counties). A regional-level case study demonstrates a process of estimating the economic impact of transportation-related scenarios on industry sectors within Economic Area 010 (the New York metropolitan region and vicinities).
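
The propagation step underlying such I-O case studies can be sketched with the Leontief inverse: total outputs satisfy x = (I - A)^(-1) c, so a final-demand perturbation Δc in a primary sector ripples into output changes Δx in every sector, which can then be ranked. The 3-sector coefficient matrix below is hypothetical, not the BEA data used in the article.

```python
import numpy as np

# Hypothetical 3-sector technical-coefficient matrix A (not BEA data).
A = np.array([[0.1, 0.2, 0.0],
              [0.3, 0.1, 0.2],
              [0.1, 0.3, 0.1]])
leontief_inv = np.linalg.inv(np.eye(3) - A)   # total-requirements matrix

# A perturbation in final demand for sector 0 (e.g., air transportation)
# propagates to all sectors through the Leontief inverse.
delta_c = np.array([-10.0, 0.0, 0.0])   # demand drop, $ millions
delta_x = leontief_inv @ delta_c        # total output change per sector
ranking = np.argsort(delta_x)           # most-affected sectors first
```

The ranking of Δx by magnitude is exactly the kind of vulnerability ordering the abstract describes as input to prioritizing "top-n" sectors.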

8.
In risk assessment, moment-independent sensitivity analysis (SA) techniques for reducing model uncertainty have attracted a great deal of attention from analysts and practitioners. They aim at measuring the relative importance of an individual input, or a set of inputs, in determining the uncertainty of the model output by looking at the entire distribution range of the model output. In this article, along the lines of Plischke et al., we point out that the original moment-independent SA index (also called the delta index) can also be interpreted as a dependence measure between the model output and the input variables, and we introduce another moment-independent SA index (called the extended delta index) based on copulas. Then, nonparametric methods for estimating the delta and extended delta indices are proposed. Both methods need only a single set of samples to compute all the indices; thus, they avoid the "curse of dimensionality." Finally, an analytical test example, a risk assessment model, and the Level E model are employed for comparing the delta and extended delta indices and testing the two calculation methods. Results show that the delta and extended delta indices produce the same importance ranking in these three test examples. It is also shown that the two proposed calculation methods dramatically reduce the computational burden.
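
As a rough illustration of the given-data idea (one sample reused for every input, which is what sidesteps the curse of dimensionality), the following sketch estimates a delta-style index by comparing conditional and unconditional output histograms across quantile bins of each input. It is a simplified stand-in, not the article's estimators.

```python
import numpy as np

def delta_index(x, y, n_bins_x=20, n_bins_y=50):
    """Given-data estimator of a Borgonovo-style delta index:
    delta = 0.5 * E_x[ integral |f_Y(y) - f_{Y|X=x}(y)| dy ],
    approximated with histograms over quantile bins of x."""
    edges_y = np.histogram_bin_edges(y, bins=n_bins_y)
    f_y, _ = np.histogram(y, bins=edges_y, density=True)
    widths = np.diff(edges_y)
    # Partition the same sample into equal-probability bins of x.
    quantiles = np.quantile(x, np.linspace(0, 1, n_bins_x + 1))
    total = 0.0
    for lo, hi in zip(quantiles[:-1], quantiles[1:]):
        mask = (x >= lo) & (x <= hi)
        if mask.sum() < 2:
            continue
        f_cond, _ = np.histogram(y[mask], bins=edges_y, density=True)
        total += mask.mean() * np.sum(np.abs(f_y - f_cond) * widths)
    return 0.5 * total

# Toy model: y depends almost entirely on x1, barely on x2.
rng = np.random.default_rng(0)
x1 = rng.normal(size=20_000)
x2 = rng.normal(size=20_000)
y = 5.0 * x1 + 0.1 * x2

d1 = delta_index(x1, y)   # large: conditioning on x1 pins y down
d2 = delta_index(x2, y)   # small: conditional density barely moves
```

Both indices come from the single 20,000-point sample; no fresh model runs per input are required, which is the computational saving the abstract reports.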

9.
The U.S. Department of Homeland Security (DHS) has mandated all regions to "carefully weigh the benefit of each homeland security endeavor and only allocate resources where the benefit of reducing risk is worth the amount of additional cost" (DHS, 2006, p. 64). This mandate illuminates the need to develop methods for systemic valuation of preparedness measures that support strategic decision making. This article proposes an analysis method that naturally emerges from the structure of the inoperability input-output model (IIM), through which various regional- and sector-specific impact analyses can be cost-effectively integrated for natural and man-made disasters. The IIM is described extensively in a companion paper (Lian et al., 2007). Its reliance on data classifications structured by the U.S. Census Bureau and its extensive accounting of economic interdependencies enable us to decompose a risk analysis activity, perform independent assessments, and properly integrate the assessments for a systemic valuation of risk and risk management activity. In this article, we account for and assess some of the major impacts of Hurricanes Katrina and Rita to demonstrate this use of the IIM and illustrate hypothetical, reduced impacts resulting from various strategic preparedness decisions. Our results indicate the capability of the IIM to guide the decision-making processes involved in developing a preparedness strategy.

10.
In this article, we propose an integrated direct and indirect flood risk model for small- and large-scale flood events, allowing for dynamic modeling of total economic losses from a flood event up to full economic recovery. A novel approach is taken that translates direct losses of both capital and labor into production losses using the Cobb-Douglas production function, aiming at improved consistency in loss accounting. The recovery of the economy is modeled using a hybrid input-output model and applied to the port region of Rotterdam, using six different flood events (with annual probabilities from 1/10 down to 1/10,000). This procedure gives better insight into the consequences of both high- and low-probability floods. The results show that in terms of expected annual damage, direct losses remain more substantial than the indirect losses (approximately 50% larger), but for low-probability events the indirect losses outweigh the direct losses. Furthermore, we explored parameter uncertainty using a global sensitivity analysis, and varied critical assumptions in the modeling framework related to, among others, flood duration and labor recovery, using a scenario approach. Our findings have two important implications for disaster modelers and practitioners. First, high-probability events are qualitatively different from low-probability events in terms of the scale of damages and the full recovery period. Second, there are substantial differences in parameter influence between high-probability and low-probability flood modeling. These findings suggest that a detailed approach is required when assessing the flood risk for a specific region.
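
The capital-and-labor translation step can be sketched with a generic Cobb-Douglas function Y = A K^alpha L^(1-alpha): if a flood knocks out given fractions of capital and labor, surviving output scales as the product of the surviving shares raised to their output elasticities. The elasticity alpha = 0.33 below is a textbook value, not the article's calibration.

```python
def production_loss_fraction(capital_loss, labor_loss, alpha=0.33):
    """Fraction of pre-flood output lost when shares of capital (K)
    and labor (L) are knocked out, under Y = A * K**alpha * L**(1-alpha).
    Output scales as (1-capital_loss)**alpha * (1-labor_loss)**(1-alpha),
    so the technology constant A cancels out of the loss fraction."""
    remaining = (1 - capital_loss) ** alpha * (1 - labor_loss) ** (1 - alpha)
    return 1 - remaining

# E.g., a flood destroying 20% of capital and idling 10% of labor:
loss = production_loss_fraction(0.20, 0.10)   # roughly a 13% output loss
```

Because A cancels, only the loss shares and the elasticity matter, which is what makes the translation from direct stock damages to production losses internally consistent.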

11.
Due to the concentration of assets in disaster-prone zones, changes in the risk landscape, and changes in the intensity of natural events, property losses have increased considerably in recent decades. While measuring these stock damages is common practice in the literature, the assessment of economic ripple effects due to business interruption is still limited, and available estimates tend to vary significantly across models. This article focuses on the most popular single-region input-output models for disaster impact evaluation. It starts with the traditional Leontief model and then compares its assumptions and results with more complex methodologies (rebalancing algorithms, the sequential interindustry model, the dynamic inoperability input-output model, and its inventory counterpart). While the estimated losses vary across models, all the figures are based on the same event, the 2007 Chehalis River flood that impacted three rural counties in Washington State. Given that the large majority of floods take place in rural areas, this article gives the practitioner a thorough review of how future events can be assessed and guidance on model selection.

12.
Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from $2 billion to $3 billion in losses late on the 12th to a peak of $50 billion for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm struck the resort areas of Charlotte Harbor and moved across the densely populated central part of the state, with early poststorm estimates in the $28 to $31 billion range, and final estimates converging at $15 billion as the actual intensity at landfall became apparent. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has a great appreciation for the role of computer models in projecting losses from hurricanes. The FCHLPM contracts with a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To inform future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a sophisticated computer model based on the Holland wind field. The sensitivity analyses presented in this article utilize standardized regression coefficients to quantify the contribution of the computer input variables to the magnitude of the wind speed.
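
Standardized regression coefficients (SRCs) can be computed by fitting an ordinary least-squares surrogate to the model's Monte Carlo sample and rescaling each coefficient by the ratio of input to output standard deviations. The sketch below uses a toy linear wind-speed response with invented parameters, not the Holland wind-field model audited by the FCHLPM.

```python
import numpy as np

def standardized_regression_coefficients(X, y):
    """Fit y ~ X by least squares and rescale each coefficient by
    std(x_j)/std(y); for a near-linear model, SRC_j**2 approximately
    partitions the output variance among the inputs."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    Xc = np.column_stack([np.ones(len(y)), X])   # add intercept column
    beta = np.linalg.lstsq(Xc, y, rcond=None)[0][1:]
    return beta * X.std(axis=0) / y.std()

# Toy wind-speed response (not the Holland model): speed dominated by
# the central pressure deficit, weakly affected by the storm radius.
rng = np.random.default_rng(1)
dp = rng.normal(50, 10, 5_000)     # pressure deficit, hPa (invented)
rmax = rng.normal(30, 5, 5_000)    # radius of max winds, km (invented)
v = 4.0 * dp + 0.5 * rmax + rng.normal(0, 5, 5_000)

src = standardized_regression_coefficients(np.column_stack([dp, rmax]), v)
```

For a nearly linear model the squared SRCs approximately sum to the regression R-squared, so each one reads directly as a share of output variance.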

13.
In this article, a classification model based on the majority rule sorting (MR-Sort) method is employed to evaluate the vulnerability of safety-critical systems with respect to malevolent intentional acts. The model is built on the basis of a (limited-size) set of data representing (a priori known) vulnerability classification examples. The empirical construction of the classification model introduces a source of uncertainty into the vulnerability analysis process: a quantitative assessment of the performance of the classification model (in terms of accuracy and confidence in the assignments) is thus in order. Three different approaches are considered to this aim: (i) a model-retrieval-based approach, (ii) the bootstrap method, and (iii) the leave-one-out cross-validation technique. The analyses are presented with reference to an illustrative case study involving the vulnerability assessment of nuclear power plants.

14.
This article introduces approaches for identifying key interdependent infrastructure sectors based on the inventory dynamic inoperability input-output model, which integrates an inventory model and a risk-based interdependency model. Identifying such key sectors narrows a policymaker's focus to the sectors causing the most impact, and receiving the most impact, from inventory-caused delays in inoperability resulting from disruptive events. A case study illustrates the practical insights of the key sector approaches, derived from workforce-centered production inoperability values built on Bureau of Economic Analysis data.

15.
Jan F. Van Impe, Risk Analysis, 2011, 31(8): 1295-1307
The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and where possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of the variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods, an ANOVA-like model and Sobol sensitivity indices, are used to obtain and compare the impact of the variability and of the uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis due to consumption of deli meats.
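
Sobol indices such as those used here are typically estimated with a pick-freeze scheme: two independent sample matrices plus hybrid matrices that take one column from the first matrix and the rest from the second. The sketch below applies a Saltelli-style first-order estimator to a toy additive model, not to the listeriosis assessment.

```python
import numpy as np

def sobol_first_order(model, sampler, n=50_000, seed=3):
    """Pick-freeze estimate of first-order Sobol indices:
    S_i = Cov(f(A), f(AB_i)) / Var(f(A)), where AB_i takes column i
    from matrix A and all other columns from an independent matrix B."""
    rng = np.random.default_rng(seed)
    A, B = sampler(rng, n), sampler(rng, n)
    fA, fB = model(A), model(B)
    var = fA.var()
    indices = []
    for i in range(A.shape[1]):
        ABi = B.copy()
        ABi[:, i] = A[:, i]
        # Covariance-style estimator of the first-order effect of input i
        indices.append(np.mean(fA * (model(ABi) - fB)) / var)
    return np.array(indices)

# Toy additive risk model (not the listeriosis case study):
def model(X):
    return X[:, 0] + 0.3 * X[:, 1]

def sampler(rng, n):
    return rng.normal(size=(n, 2))

S = sobol_first_order(model, sampler)
# Analytically S = [1/1.09, 0.09/1.09] for this toy model.
```

The same A/B/AB_i evaluations can be reused for total-order indices, which is why pick-freeze designs are the standard workhorse for variance-based SA.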

16.
Groundwater leakage into subsurface constructions can cause reduction of pore pressure and subsidence in clay deposits, even at large distances from the location of the construction. The potential cost of damage is substantial, particularly in urban areas. The large-scale process also implies heterogeneous soil conditions that cannot be described in complete detail, which causes a need for estimating the uncertainty of subsidence with probabilistic methods. In this study, the risk for subsidence is estimated by coupling two probabilistic models, a geostatistics-based soil stratification model and a subsidence model. Statistical analyses of stratification and soil properties are inputs into the models. The results include spatially explicit probabilistic estimates of subsidence magnitude and sensitivities of included model parameters. From these, areas with significant risk for subsidence are distinguished from low-risk areas. The efficiency and usefulness of this modeling approach as a tool for communication to stakeholders, decision support for prioritization of risk-reducing measures, and identification of the need for further investigations and monitoring are demonstrated with a case study of a planned tunnel in Stockholm.

17.
This article demonstrates the application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in the ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.

18.
M. C. Kennedy, Risk Analysis, 2011, 31(10): 1597-1609
Two-dimensional Monte Carlo simulation is frequently used to implement probabilistic risk models, as it allows uncertainty and variability to be quantified separately. In many cases, we are interested in the proportion of individuals from a variable population exceeding a critical threshold, together with the uncertainty about this proportion. In this article we introduce a new method that can estimate these quantities accurately and much more efficiently than conventional algorithms. We also show how the model parameters having the greatest impact on the probabilities of rare events can be quickly identified via this method. The algorithm combines elements from well-established statistical techniques in extreme value theory and the Bayesian analysis of computer models. We demonstrate the practical application of these methods with a simple example, in which the true distributions are known exactly, and also with a more realistic model of microbial contamination of milk with seven parameters. For the latter, sensitivity analysis (SA) is shown to identify the two inputs explaining the majority of variation in distribution tail behavior. In the subsequent prediction of probabilities of large contamination events, similar results are obtained using the new approach, which takes 43 seconds, and the conventional simulation, which requires more than 3 days.
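
The conventional algorithm that the new method is benchmarked against is plain two-dimensional Monte Carlo: an outer loop samples the uncertain parameters, an inner loop samples population variability, and each outer draw yields one estimate of the exceedance proportion. The dose model and threshold below are invented for illustration, not the milk contamination model.

```python
import numpy as np

def exceedance_2d(n_outer=200, n_inner=2_000, threshold=8.0, seed=2):
    """Conventional two-dimensional Monte Carlo: each outer draw fixes
    the uncertain parameters (mu, sigma); the inner sample over
    inter-individual variability then gives one estimate of the
    population fraction exceeding the threshold."""
    rng = np.random.default_rng(seed)
    fractions = np.empty(n_outer)
    for i in range(n_outer):
        mu = rng.normal(5.0, 0.5)       # uncertain population mean
        sigma = rng.uniform(0.8, 1.2)   # uncertain population spread
        doses = rng.normal(mu, sigma, n_inner)   # variability sample
        fractions[i] = np.mean(doses > threshold)
    return fractions

frac = exceedance_2d()
# Central estimate of the exceeding fraction plus an uncertainty bound:
median, upper = np.median(frac), np.percentile(frac, 97.5)
```

The inefficiency is visible here: for rare events the inner sample contributes only a handful of exceedances per outer draw, which is why emulator- and extreme-value-based shortcuts like the article's can be orders of magnitude faster.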

19.
Estimates of the cost of potential disasters, including indirect economic consequences, are an important input in the design of risk management strategies. The adaptive regional input-output (ARIO) inventory model is a tool to assess indirect disaster losses and to analyze their drivers. It is based on an input-output structure, but it also (i) explicitly represents production bottlenecks and input scarcity and (ii) introduces inventories as an additional source of flexibility in the production system. This modeling strategy distinguishes between (i) essential supplies that cannot be stocked (e.g., electricity, water), whose scarcity can paralyze all economic activity; (ii) essential supplies that can be stocked at least temporarily (e.g., steel, chemicals), whose scarcity creates problems only over the medium term; and (iii) supplies that are not essential in the production process, whose scarcity is problematic only over the long run and which are therefore easy to replace with imports. The model is applied to the landfall of Hurricane Katrina in Louisiana and identifies two periods in the disaster aftermath: (1) the first year, during which production bottlenecks are responsible for large output losses; and (2) the rest of the reconstruction period, during which bottlenecks are nonexistent and output losses are lower. This analysis also suggests important research questions and policy options to mitigate disaster-related output losses.

20.
Introduction of classical swine fever virus (CSFV) is a continuing threat to the pig production sector in the European Union. A scenario tree model was developed to obtain more insight into the main risk factors determining the probability of CSFV introduction, P(CSFV). As this model contains many uncertain input parameters, sensitivity analysis was used to indicate which of these parameters influence model results most. Group screening combined with the statistical techniques of design of experiments and meta-modeling was applied to detect the most important uncertain input parameters among a total of 257 parameters. The response variable chosen was the annual P(CSFV) into the Netherlands. Only 128 scenario calculations were needed to specify the final meta-model. A subsequent one-at-a-time sensitivity analysis was performed with the main effects of this meta-model to explore their impact on the ranking of risk factors contributing most to the annual P(CSFV). The results indicated that model outcome is most sensitive to the uncertain input parameters concerning the expected number of classical swine fever epidemics in Germany, Belgium, and the United Kingdom, and the probability that CSFV survives in an empty livestock truck traveling over a distance of 0-900 km.
