Similar Documents
20 similar documents found.
1.
Outbreaks of contagious diseases underscore the ever-looming threat of new epidemics. Compared to other disasters that inflict physical damage to infrastructure systems, epidemics can have more devastating and prolonged impacts on the population. This article investigates the interdependent economic and productivity risks resulting from epidemic-induced workforce absenteeism. In particular, we develop a dynamic input-output model capable of generating sector-disaggregated economic losses based on different magnitudes of workforce disruptions. An ex post analysis of the 2009 H1N1 pandemic in the national capital region (NCR) reveals the distribution of consequences across different economic sectors. Consequences are categorized into two metrics: (i) economic loss, which measures the magnitude of monetary losses incurred in each sector, and (ii) inoperability, which measures the normalized monetary losses incurred in each sector relative to the total economic output of that sector. For a simulated mild pandemic scenario in the NCR, two distinct rankings are generated using the economic loss and inoperability metrics. Results indicate that the majority of the critical sectors ranked according to the economic loss metric comprise sectors that contribute the most to the NCR's gross domestic product (e.g., federal government enterprises). In contrast, the majority of the critical sectors generated by the inoperability metric include sectors that are involved with epidemic management (e.g., hospitals). Hence, prioritizing sectors for recovery necessitates consideration of the balance between economic loss, inoperability, and other objectives. Although applied specifically to the NCR, the proposed methodology can be customized for other regions.
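
A minimal sketch of the two metrics (with invented placeholder figures, not the study's data) shows why the rankings can diverge:

```python
# Two consequence metrics per sector: absolute economic loss vs. inoperability
# (loss normalized by total sector output). All figures are illustrative.

sectors = {
    "federal_government_enterprises": {"loss_musd": 950.0, "output_musd": 60000.0},
    "hospitals":                      {"loss_musd": 120.0, "output_musd": 1500.0},
}

for name, s in sectors.items():
    inoperability = s["loss_musd"] / s["output_musd"]
    print(f"{name}: loss = ${s['loss_musd']:.0f}M, inoperability = {inoperability:.3f}")

# hospitals show a far smaller absolute loss but an order-of-magnitude higher
# inoperability -- exactly the divergence between the two rankings described above.
```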

2.
Input-output analysis is frequently used in studies of large-scale weather-related (e.g., hurricane and flooding) disruption of a regional economy. The economy after a sudden catastrophe shows a multitude of imbalances with respect to demand and production and may take months or years to recover. However, there is no consensus about how the economy recovers. This article presents a theoretical route map for imbalanced economic recovery called dynamic inequalities. Subsequently, it is applied to a hypothetical postdisaster economic scenario of flooding in London around the year 2020 to assess the influence of future shocks to a regional economy and to suggest adaptation measures. Economic projections produced by a macroeconometric model are used as baseline conditions. The results suggest that London's economy would recover over approximately 70 months under a proportional rationing scheme, assuming an initial 50% labor loss (with full recovery in six months), a 40% initial loss to service sectors, and a 10-30% initial loss to other sectors. The results also suggest that imbalance will be the norm during the postdisaster period of economic recovery, even though balance may occur temporarily. Model sensitivity analysis suggests that a proportional rationing scheme may be an effective strategy during postdisaster economic reconstruction, and that policies for transportation recovery and health care are essential for effective postdisaster economic recovery.
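
One concrete reading of a proportional rationing scheme (our illustration, with invented numbers): when post-disaster capacity falls short of demand, every order is filled at the same fraction, preserving relative demand shares.

```python
def ration_proportionally(available: float, orders: dict[str, float]) -> dict[str, float]:
    """Scale every order by the same factor so allocations sum to `available`."""
    total = sum(orders.values())
    share = min(1.0, available / total) if total > 0 else 0.0
    return {customer: qty * share for customer, qty in orders.items()}

# Illustrative post-flood situation: only 70 units of output against 100 demanded.
orders = {"services": 40.0, "transport": 25.0, "households": 35.0}
print(ration_proportionally(70.0, orders))
# {'services': 28.0, 'transport': 17.5, 'households': 24.5} -- shares preserved
```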

3.
Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have the potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental for international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios and guide investments that are effective and feasible for implementation. Priorities for protective measures and continuity-of-operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods are demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually and will be useful to port management, private-industry supply chain planning, and transportation infrastructure management.

4.
Marco Percoco, Risk Analysis, 2011, 31(7): 1038-1042
Natural and man-made disasters are currently a source of major concern for contemporary societies. In order to understand their economic impacts, the inoperability input-output model has recently gained recognition among scholars. In a recent paper, Percoco (2006) proposed an extension of the model to map the technologically most important sectors through so-called fields of influence. In the present note, we aim to show that this importance measure also has a clear connection with local sensitivity analysis theory.
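
The connection is easy to see computationally: the first-order field of influence of a technical coefficient a_ij is the derivative of the Leontief inverse with respect to that coefficient. A hedged numpy sketch (illustrative 3-sector matrix, not data from the note):

```python
import numpy as np

# Sensitivity of the Leontief inverse L = (I - A)^{-1} to coefficient a_ij:
#   dL/da_ij = L[:, i] (outer product) L[j, :]
# which is precisely the first-order field of influence of a_ij.

A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
L = np.linalg.inv(np.eye(3) - A)

i, j = 0, 1                              # perturbed technical coefficient a_01
field = np.outer(L[:, i], L[j, :])       # field of influence of a_ij

# Its overall magnitude is one way to rank technologically important coefficients.
print(field.sum())
```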

5.
This article introduces approaches for identifying key interdependent infrastructure sectors based on the inventory dynamic inoperability input-output model, which integrates an inventory model and a risk-based interdependency model. Identifying such key sectors narrows a policymaker's focus to the sectors that exert, and those that absorb, the greatest impact from inventory-caused delays in inoperability resulting from disruptive events. A case study based on workforce-centered production inoperability values derived from Bureau of Economic Analysis data illustrates the practical insights of the key-sector approaches.
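
For orientation, a sketch of the basic dynamic inoperability recursion that the inventory model extends (parameters are our toy values; the inventory-buffer terms of the full model are omitted):

```python
import numpy as np

# Dynamic inoperability recursion in the spirit of the DIIM family:
#   q(t+1) = q(t) + K (A* q(t) + c*(t) - q(t))
# A* is the interdependency matrix, c* a demand perturbation, and K a diagonal
# matrix of sector resilience coefficients. All values are illustrative.

A_star = np.array([[0.0, 0.3],
                   [0.2, 0.0]])
K = np.diag([0.4, 0.6])          # sector resilience coefficients
q = np.array([0.10, 0.02])       # initial inoperability (10% and 2%)
c_star = np.array([0.0, 0.0])    # no sustained demand perturbation

for t in range(30):
    q = q + K @ (A_star @ q + c_star - q)
print(q)  # inoperability decays toward zero as the sectors recover
```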

6.
In this article, we propose an integrated direct and indirect flood risk model for small- and large-scale flood events, allowing for dynamic modeling of total economic losses from a flood event through to full economic recovery. A novel approach is taken that translates direct losses of both capital and labor into production losses using the Cobb-Douglas production function, aiming at improved consistency in loss accounting. The recovery of the economy is modeled using a hybrid input-output model and applied to the port region of Rotterdam, using six different flood events (with annual probabilities from 1/10 down to 1/10,000). This procedure provides better insight into the consequences of both high- and low-probability floods. The results show that in terms of expected annual damage, direct losses remain more substantial than the indirect losses (approximately 50% larger), but for low-probability events the indirect losses outweigh the direct losses. Furthermore, we explored parameter uncertainty using a global sensitivity analysis, and varied critical assumptions in the modeling framework related to, among others, flood duration and labor recovery, using a scenario approach. Our findings have two important implications for disaster modelers and practitioners. First, high-probability events are qualitatively different from low-probability events in terms of the scale of damages and the full recovery period. Second, there are substantial differences in parameter influence between high-probability and low-probability flood modeling. These findings suggest that a detailed approach is required when assessing the flood risk for a specific region.
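
The capital-and-labor translation step can be illustrated with a toy Cobb-Douglas calculation (parameters are ours, not calibrated to the Rotterdam case):

```python
# Translate direct losses of capital (K) and labor (L) into a production loss
# via Y = A * K^alpha * L^(1-alpha). Illustrative parameters only.

def production(K: float, L: float, alpha: float = 0.33, tfp: float = 1.0) -> float:
    return tfp * K**alpha * L**(1 - alpha)

Y0 = production(K=100.0, L=100.0)   # pre-flood output
# Flood destroys 20% of capital and idles 10% of labor:
Y1 = production(K=80.0, L=90.0)
print(f"production loss: {1 - Y1 / Y0:.1%}")   # roughly 13% under these assumptions
```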

7.
Knowledge of failure events and their associated factors, gained from past construction projects, is regarded as potentially very useful in risk management. However, several circumstances constrain its wider use: such knowledge is usually scarce, seldom documented, and often unavailable when it is required, and proven methods to integrate and analyze it cost-effectively are lacking. This article addresses possible options to overcome these difficulties. Focusing on a limited number of critical potential failure events, the article demonstrates how knowledge on several important potential failure events in tunnel works can be integrated. The problem of unavailable or incomplete information was addressed by gathering judgments from a group of experts. The elicited expert knowledge consisted of failure scenarios and associated probabilistic information. This information was integrated using Bayesian belief network models that were first customized to deal with the expected divergence in judgments caused by the epistemic uncertainty of risks. The work described in the article shows that the developed models, which integrate risk-related knowledge, provide guidance as to the use of specific remedial measures.
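
In miniature, the kind of calculation such belief-network models support looks like this (a hand-rolled two-node example with hypothetical probabilities, not the article's elicited values):

```python
# Two-node Bayesian-network-style calculation: expert-elicited probabilities
# for a single tunnel failure scenario. Structure and numbers are hypothetical.

p_poor_ground = 0.3                                # P(ground = poor), elicited
p_fail_given = {"poor": 0.05, "good": 0.005}       # P(failure | ground state)

# Marginal (predictive) failure probability
p_fail = (p_poor_ground * p_fail_given["poor"]
          + (1 - p_poor_ground) * p_fail_given["good"])

# Diagnostic reasoning via Bayes' rule: P(ground = poor | failure observed)
p_poor_given_fail = p_poor_ground * p_fail_given["poor"] / p_fail

print(f"P(failure) = {p_fail:.4f}")                        # 0.0185
print(f"P(poor ground | failure) = {p_poor_given_fail:.2f}")  # ~0.81
```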

8.
The authors of this article have developed six probabilistic causal models for critical risks in tunnel works. The details of the models' development and evaluation were reported in two earlier publications in this journal. Accordingly, as a remaining step, this article focuses on the use of these models in a real case study project. Using the models is challenging given the need to provide information on risks that are usually both project and context dependent. The latter is of particular concern in underground construction projects. Tunnel risks are the consequences of interactions between site- and project-specific factors. Large variations and uncertainties in ground conditions as well as project singularities give rise to particular risk factors with very specific impacts. These circumstances mean that existing risk information, gathered from previous projects, is extremely difficult to use in other projects. This article considers these issues and addresses the extent to which prior risk-related knowledge, in the form of causal models such as those developed for the investigation, can provide useful risk information for the case study project. The identification and characterization of the causes and conditions that lead to failures, their interactions, and the associated probabilistic information are treated as risk-related knowledge in this article. It is shown that, irrespective of existing constraints on using information and knowledge from past experience, construction risk-related knowledge can be transferred from project to project in the form of comprehensive models based on probabilistic causal relationships. The article also shows that the developed models provide guidance on the use of specific remedial measures by identifying critical risk factors, and therefore they support risk management decisions. A number of limitations of the models are also discussed.

9.
Access management, which systematically limits opportunities for egress and ingress of vehicles to highway lanes, is critical to protect trillions of dollars of current investment in transportation. This article addresses allocating resources for access management with incomplete and partially relevant data on crash rates, travel speeds, and other factors. While access management can be effective in avoiding crashes, reducing travel times, and increasing route capacities, the literature suggests a need for performance metrics to guide investments in resource allocation across large corridor networks and several time horizons. In this article, we describe a quantitative decision model to support an access management program via risk-cost-benefit analysis under data uncertainties from diverse sources of data and expertise. The approach quantifies potential benefits, including safety improvements and travel time savings, and the costs of access management through functional relationships among input parameters including crash rates, corridor access-point densities, and traffic volumes. Parameter uncertainties, which vary across locales and experts, are addressed via numerical interval analyses, as sketched below. This approach is demonstrated at several geographic scales across 7,000 kilometers of highways in a geographic region and several subregions. The demonstration prioritizes route segments that would benefit from risk management, including (i) additional data or elicitation, (ii) right-of-way purchases, (iii) restriction or closing of access points, (iv) new alignments, (v) developer proffers, and (vi) other interventions. The approach ought to be of wide interest to analysts, planners, policymakers, and stakeholders who rely on heterogeneous data and expertise for risk management.
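
A minimal sketch of the interval step (with invented bounds): propagate [low, high] parameter ranges into bounds on the safety benefit of managing a segment.

```python
# Interval propagation of uncertain parameters into a benefit bound.
# All figures are illustrative, not the study's elicited ranges.

def interval_mul(a: tuple, b: tuple) -> tuple:
    """Product of two nonnegative intervals [a_lo, a_hi] x [b_lo, b_hi]."""
    return (a[0] * b[0], a[1] * b[1])

crash_rate_reduction = (0.05, 0.20)   # crashes avoided per million vehicle-km
exposure = (8.0, 12.0)                # million vehicle-km per year on the segment

benefit = interval_mul(crash_rate_reduction, exposure)
print(f"crashes avoided per year: [{benefit[0]:.1f}, {benefit[1]:.1f}]")
```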

10.
In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented.
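
Where no closed form is available, the same failure probability can be checked by simulation; a toy Monte Carlo first-passage estimate follows (a random-walk stand-in for the article's risk process, with invented parameters):

```python
import random

# Estimate P(risk process X(t) crosses a critical level within a finite horizon)
# by simulating first passage of a drifted random walk. Toy stand-in only.

def failure_probability(level=5.0, horizon=100, n_runs=20_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_runs):
        x = 0.0
        for _ in range(horizon):
            x += rng.gauss(0.05, 1.0)   # drift + noise increment
            if x >= level:              # first passage over the risk level
                failures += 1
                break
    return failures / n_runs

print(failure_probability())
```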

11.
How can risk analysts help to improve policy and decision making when the correct probabilistic relation between alternative acts and their probable consequences is unknown? This practical challenge of risk management with model uncertainty arises in problems from preparing for climate change to managing emerging diseases to operating complex and hazardous facilities safely. We review constructive methods for robust and adaptive risk analysis under deep uncertainty. These methods are not yet as familiar to many risk analysts as older statistical and model-based methods, such as the paradigm of identifying a single "best-fitting" model and performing sensitivity analyses for its conclusions. They provide genuine breakthroughs for improving predictions and decisions when the correct model is highly uncertain. We demonstrate their potential by summarizing a variety of practical risk management applications.

12.
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context.
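
The possibilistic propagation step can be sketched with alpha-cuts through a tiny two-event OR gate (the triangular possibility distributions and all values are our illustration, not the article's case):

```python
# Alpha-cut propagation of possibilistic basic-event probabilities through an
# OR gate: top = 1 - (1 - p1)(1 - p2). Each epistemically uncertain probability
# is a triangular possibility distribution (lo, mode, hi). Values illustrative.

def alpha_cut(tri, alpha):
    lo, mode, hi = tri
    return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

p1 = (1e-4, 5e-4, 1e-3)
p2 = (2e-4, 4e-4, 8e-4)

for alpha in (0.0, 0.5, 1.0):
    a, b = alpha_cut(p1, alpha), alpha_cut(p2, alpha)
    # The OR gate is monotone in each p, so interval endpoints map directly.
    top_lo = 1 - (1 - a[0]) * (1 - b[0])
    top_hi = 1 - (1 - a[1]) * (1 - b[1])
    print(f"alpha={alpha:.1f}: top event probability in [{top_lo:.2e}, {top_hi:.2e}]")
```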

13.
Risk Analysis, 2018, 38(8): 1585-1600
Historical data analysis shows that escalation accidents, so-called domino effects, play an important role in disastrous accidents in the chemical and process industries. In this study, an agent-based modeling and simulation approach is proposed to study the propagation of domino effects in these industries. Unlike analytical or Monte Carlo simulation approaches, which normally study the domino effect at the level of probabilistic networks, the agent-based modeling technique explains domino effects from a bottom-up perspective. In this approach, the installations involved in a domino effect are modeled as agents, whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only to model higher-level domino effects and synergistic effects but also to account for temporal dependencies. The model can readily be applied to large-scale complicated cases.
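
The flavor of the bottom-up rules can be conveyed by a toy version (geometry, radiation rule, and threshold are invented; real models use physically derived heat-flux and time-to-failure relations):

```python
import math

# Toy domino propagation: each tank agent accumulates heat load from burning
# neighbors and fails once a (hypothetical) cumulative threshold is exceeded,
# possibly igniting further tanks in later steps. All values illustrative.

tanks = [
    {"pos": (0, 0),  "burning": True,  "load": 0.0},   # primary failure
    {"pos": (20, 0), "burning": False, "load": 0.0},
    {"pos": (40, 0), "burning": False, "load": 0.0},
]
THRESHOLD = 15.0   # cumulative heat dose triggering escalation (arbitrary units)

for step in range(10):
    for t in tanks:
        if t["burning"]:
            continue
        for src in tanks:
            if src["burning"]:
                d = math.dist(t["pos"], src["pos"])
                t["load"] += 1000.0 / d**2   # crude inverse-square radiation rule
    for t in tanks:
        if not t["burning"] and t["load"] >= THRESHOLD:
            t["burning"] = True   # second- and higher-level domino events

print([t["burning"] for t in tanks])
```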

14.
In risk assessment, the moment-independent sensitivity analysis (SA) technique for reducing model uncertainty has attracted a great deal of attention from analysts and practitioners. It aims at measuring the relative importance of an individual input, or a set of inputs, in determining the uncertainty of the model output by looking at the entire distribution range of the model output. In this article, along the lines of Plischke et al., we point out that the original moment-independent SA index (also called the delta index) can also be interpreted as a dependence measure between model output and input variables, and we introduce another moment-independent SA index (called the extended delta index) based on copulas. Then, nonparametric methods for estimating the delta and extended delta indices are proposed. Both methods need only a single set of samples to compute all the indices; thus, they overcome the "curse of dimensionality." Finally, an analytical test example, a risk assessment model, and the Level E model are employed to compare the delta and extended delta indices and to test the two calculation methods. Results show that the delta and extended delta indices produce the same importance ranking in these three test examples. It is also shown that the two proposed calculation methods dramatically reduce the computational burden.
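
A given-data sketch of the delta index (toy linear model; the partition-based conditional densities below are one common single-sample strategy, not necessarily the authors' exact estimator):

```python
import numpy as np
from scipy.stats import gaussian_kde

# delta_i = 0.5 * E_{X_i}[ integral |f_Y(y) - f_{Y|X_i}(y)| dy ],
# estimated from one input/output sample by binning on X_i (no extra model runs).

rng = np.random.default_rng(0)
n = 5000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = x1 + 0.3 * x2                      # toy model; x1 should rank higher

grid = np.linspace(y.min(), y.max(), 200)
dy = grid[1] - grid[0]
f_y = gaussian_kde(y)(grid)            # unconditional output density

def delta(x, y, n_bins=20):
    total = 0.0
    for chunk in np.array_split(np.argsort(x), n_bins):
        f_cond = gaussian_kde(y[chunk])(grid)          # conditional density in bin
        total += (len(chunk) / len(x)) * np.abs(f_y - f_cond).sum() * dy
    return 0.5 * total

print(delta(x1, y), delta(x2, y))      # the same sample serves both indices
```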

15.
A Monte Carlo simulation is incorporated into a risk assessment for trichloroethylene (TCE) using physiologically-based pharmacokinetic (PBPK) modeling coupled with the linearized multistage model to derive human carcinogenic risk extrapolations. The Monte Carlo technique incorporates physiological parameter variability to produce a statistically derived range of risk estimates which quantifies specific uncertainties associated with PBPK risk assessment approaches. Both inhalation and ingestion exposure routes are addressed. Simulated exposure scenarios were consistent with those used by the Environmental Protection Agency (EPA) in their TCE risk assessment. Mean values of physiological parameters were gathered from the literature for both mice (carcinogenic bioassay subjects) and for humans. Realistic physiological value distributions were assumed using existing data on variability. Mouse cancer bioassay data were correlated to total TCE metabolized and area-under-the-curve (blood concentration) trichloroacetic acid (TCA) as determined by a mouse PBPK model. These internal dose metrics were used in a linearized multistage model analysis to determine dose metric values corresponding to a 10⁻⁶ lifetime excess cancer risk. Using a human PBPK model, these metabolized doses were then extrapolated to equivalent human exposures (inhalation and ingestion). The Monte Carlo iterations with varying mouse and human physiological parameters produced a range of human exposure concentrations producing a 10⁻⁶ risk.
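
The Monte Carlo layer itself is simple to illustrate; the sketch below samples two physiological parameters and pushes them through a deliberately crude one-compartment stand-in for the PBPK model (every distribution and value is hypothetical):

```python
import math
import random
import statistics

# Sample physiological parameters, compute a toy internal dose per unit
# exposure, and invert to the exposure concentration hitting a target dose
# metric. A stand-in for the full mouse/human PBPK chain, not a PBPK model.

random.seed(1)
target_internal_dose = 0.5    # dose metric at the 10^-6 risk level (hypothetical)

exposures = []
for _ in range(10_000):
    ventilation = random.lognormvariate(math.log(7.5), 0.2)   # illustrative
    vmax = random.lognormvariate(math.log(5.0), 0.3)          # metabolic capacity
    uptake_per_ppm = 0.5 * ventilation                        # toy absorption term
    metabolized = uptake_per_ppm * vmax / (vmax + uptake_per_ppm)  # toy saturation
    exposures.append(target_internal_dose / metabolized)     # ppm at target dose

q = statistics.quantiles(exposures, n=20)
print(f"median {statistics.median(exposures):.3f} ppm; "
      f"5th-95th pct {q[0]:.3f}-{q[-1]:.3f} ppm")
```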

16.
Model uncertainty is a primary source of uncertainty in the assessment of the performance of repositories for the disposal of nuclear wastes, due to the complexity of the system and the large spatial and temporal scales involved. This work considers multiple assumptions on the system behavior and corresponding alternative plausible modeling hypotheses. To characterize the uncertainty in the correctness of the different hypotheses, the opinions of different experts are treated probabilistically or, alternatively, by the belief and plausibility functions of Dempster-Shafer theory. A comparison is made with reference to a flow model for the evaluation of the hydraulic head distributions present at a radioactive waste repository site. Three experts are assumed available for the evaluation of the uncertainties associated with the hydrogeological properties of the repository and the groundwater flow mechanisms.
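
For the Dempster-Shafer treatment, expert mass assignments over the modeling hypotheses are pooled with Dempster's rule of combination; a compact sketch (frame and masses are our illustration, not the study's elicitations):

```python
from itertools import product

# Dempster's rule for two experts assigning mass to hypotheses about a
# modeling assumption. frozenset({"H1", "H2"}) encodes "either H1 or H2",
# i.e., mass left uncommitted between the two hypotheses.

def combine(m1, m2):
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1 - conflict) for s, w in combined.items()}

H1, H2 = frozenset({"H1"}), frozenset({"H2"})
expert1 = {H1: 0.6, H1 | H2: 0.4}
expert2 = {H1: 0.3, H2: 0.3, H1 | H2: 0.4}

m = combine(expert1, expert2)
belief_H1 = sum(w for s, w in m.items() if s <= H1)   # Bel(H1): committed support
plaus_H1 = sum(w for s, w in m.items() if s & H1)     # Pl(H1): not contradicting H1
print(f"Bel(H1) = {belief_H1:.3f}, Pl(H1) = {plaus_H1:.3f}")
```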

17.
Mean-deviation analysis, along with the existing theories of coherent risk measures and dual utility, is examined in the context of the theory of choice under uncertainty, which studies rational preference relations for random outcomes based on different sets of axioms such as transitivity, monotonicity, and continuity. An axiomatic foundation of the theory of coherent risk measures is obtained as a relaxation of the axioms of dual utility theory, and a further relaxation of the axioms is shown to lead to mean-deviation analysis. Paradoxes arising from the sets of axioms corresponding to these theories and their possible resolutions are discussed, and the application of mean-deviation analysis to optimal risk sharing and portfolio selection in the context of rational choice is considered.

18.
The cost-benefit evaluation of adopting passive fire protection in the road transport of liquefied petroleum gas (LPG) was investigated. In a previous study, mathematical simulations of real-scale fire scenarios proved the effectiveness of passive fire protections in preventing the "fired" boiling liquid expanding vapor explosion (BLEVE), thus providing a significant risk reduction. In the present study, the economic aspects of adopting fire protections are analyzed and an approach to cost-benefit analysis (CBA) is proposed. The CBA model compares the risk reduction due to fire protections (expressed in monetary terms through the value of a statistical life) with the cost of applying fire protections to a fleet of tankers. Different types of fire protections were considered, as well as the possibility of protecting the entire fleet or only part of it. The application of the proposed model to a real-life case study is presented and discussed. Results demonstrate that the adoption of passive fire protections on road tankers, though not compulsory in Europe, can be economically feasible, thus representing a concrete measure to achieve control of the "major hazard accidents" cited in European legislation.
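
The skeleton of such a CBA is a single comparison once the risk reduction is monetized via the value of a statistical life; every number below is invented for illustration:

```python
# Benefit: monetized expected fatalities avoided over the protection's lifetime.
# Cost: fitting passive fire protection to (part of) the tanker fleet.
# All figures are illustrative placeholders.

VSL = 4.0e6                            # value of a statistical life, EUR
avoided_fatalities_per_year = 0.02     # whole-fleet risk reduction
lifetime_years = 15                    # service life of the protection

fleet_size = 100
cost_per_tanker = 1.0e4                # EUR, incl. added tare-weight penalty

benefit = avoided_fatalities_per_year * lifetime_years * VSL
cost = fleet_size * cost_per_tanker
print(f"benefit EUR {benefit:,.0f} vs cost EUR {cost:,.0f}; B/C = {benefit / cost:.2f}")
```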

19.
We examine the role of knowledge diversity among unit members in an organizational unit's productivity. Utilizing a proprietary data set of corrective maintenance tasks from a large software-services firm, we investigate the impact of two key within-unit diversity metrics: interpersonal diversity and intrapersonal diversity. We analyze the independent influence of interpersonal diversity and the interactive influence of interpersonal and intrapersonal diversity on an organizational unit's productivity. Finally, we examine how diversity moderates the productivity of an organizational unit when employee turnover occurs. Our analysis reveals the following key insights: (a) interpersonal diversity has an inverted U-shaped effect on organizational-unit productivity; (b) intrapersonal diversity moderates the influence of interpersonal diversity on organizational-unit productivity; (c) at higher levels of interpersonal diversity, the rate at which turnover reduces the unit's productivity is higher. We discuss the theoretical and managerial insights associated with these findings.

20.
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach to link the new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event in the milk, the fresh cheese, or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of the major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.
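
Two of the building blocks of such a model, growth during storage and dose-response at consumption, can be sketched as follows (parameter values are illustrative, not the article's):

```python
import math

# Log-linear growth of L. monocytogenes during cold storage, then an
# exponential dose-response at consumption. All parameters illustrative.

def grow(log10_n0, mu_log10_per_day, days, log10_nmax=8.0):
    """Log-linear growth capped at a maximum population density."""
    return min(log10_n0 + mu_log10_per_day * days, log10_nmax)

def p_illness(dose_cfu, r=1e-12):
    """Exponential dose-response: P = 1 - exp(-r * dose); r for susceptibles."""
    return 1 - math.exp(-r * dose_cfu)

log10_conc = grow(log10_n0=0.5, mu_log10_per_day=0.2, days=14)  # cfu/g at eating
dose = 10**log10_conc * 25.0                                    # 25 g serving
print(f"{log10_conc:.1f} log10 cfu/g -> P(illness) ~ {p_illness(dose):.2e}")
```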

