Similar Documents
20 similar documents found (search time: 718 ms)
1.
Dynamic reliability methods aim at complementing the capability of traditional static approaches (e.g., event trees [ETs] and fault trees [FTs]) by accounting for the system dynamic behavior and its interactions with the system state transition process. For this, the system dynamics is here described by a time‐dependent model that includes the dependencies with the stochastic transition events. In this article, we present a novel computational framework for dynamic reliability analysis whose objectives are i) accounting for discrete stochastic transition events and ii) identifying the prime implicants (PIs) of the dynamic system. The framework entails adopting a multiple‐valued logic (MVL) to consider stochastic transitions at discretized times. Then, PIs are identified by a differential evolution (DE) algorithm that looks for the optimal MVL solution of a covering problem formulated for MVL accident scenarios. For testing the feasibility of the framework, a dynamic noncoherent system composed of five components that can fail at discretized times has been analyzed, showing the applicability of the framework to practical cases.
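The prime-implicant idea can be illustrated at toy scale. The sketch below enumerates the prime implicants of a small noncoherent MVL structure function by brute force; the three-valued alphabet, the three-component structure function, and the failure logic are all invented for illustration, and the exhaustive enumeration merely stands in for the article's differential-evolution search, which is what makes realistic problem sizes tractable.

```python
from itertools import combinations, product

# Toy MVL model: 3 components, each taking a value in {0, 1, 2}
# (e.g., fails in window 0, fails in window 1, never fails) - hypothetical.
VALUES = (0, 1, 2)
N = 3

def system_fails(state):
    # Hypothetical noncoherent structure function, for illustration only.
    a, b, c = state
    return (a == 0 and b != 2) or (c == 1)

def is_implicant(literals):
    """literals: {component: value}. True if every completion of the
    partial assignment leads to system failure."""
    free = [i for i in range(N) if i not in literals]
    for combo in product(VALUES, repeat=len(free)):
        state = dict(zip(free, combo))
        state.update(literals)
        if not system_fails(tuple(state[i] for i in range(N))):
            return False
    return True

def prime_implicants():
    # Enumerate smallest-first; an implicant is prime if no smaller
    # implicant already found is contained in it.
    primes = []
    for size in range(1, N + 1):
        for idxs in combinations(range(N), size):
            for vals in product(VALUES, repeat=size):
                lit = dict(zip(idxs, vals))
                if not is_implicant(lit):
                    continue
                if not any(all(lit.get(i) == v for i, v in p.items())
                           for p in primes):
                    primes.append(lit)
    return primes
```

For this toy structure function the search returns three prime implicants; at realistic component counts and time discretizations the search space explodes combinatorially, which is why a metaheuristic such as DE is needed.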

2.
We present a comprehensive framework for Bayesian estimation of structural nonlinear dynamic economic models on sparse grids to overcome the curse of dimensionality for approximations. We apply sparse grids to a global polynomial approximation of the model solution, to the quadrature of integrals arising as rational expectations, and to three new nonlinear state space filters which speed up the sequential importance resampling particle filter. The posterior of the structural parameters is estimated by a new Metropolis–Hastings algorithm with mixing parallel sequences. The parallel extension improves the global maximization property of the algorithm, simplifies the parameterization for an appropriate acceptance ratio, and allows a simple implementation of the estimation on parallel computers. Finally, we provide all algorithms in the open source software JBendge for the solution and estimation of a general class of models.
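The baseline that the paper's filters accelerate, sequential importance resampling (SIR), can be sketched on a toy linear-Gaussian state space. The model coefficients, noise scales, and particle count below are hypothetical, and none of the sparse-grid machinery appears here; this only shows the propagate-weight-resample loop itself.

```python
import math
import random

random.seed(0)

# Toy state space: x_t = 0.9 x_{t-1} + w_t,  y_t = x_t + v_t,  w, v ~ N(0, 1)
def simulate(T):
    x, xs, ys = 0.0, [], []
    for _ in range(T):
        x = 0.9 * x + random.gauss(0, 1)
        xs.append(x)
        ys.append(x + random.gauss(0, 1))
    return xs, ys

def sir_filter(ys, n_particles=500):
    parts = [random.gauss(0, 1) for _ in range(n_particles)]
    means = []
    for y in ys:
        # propagate each particle through the state equation
        parts = [0.9 * p + random.gauss(0, 1) for p in parts]
        # weight by the Gaussian likelihood of the observation
        w = [math.exp(-0.5 * (y - p) ** 2) for p in parts]
        s = sum(w)
        w = [wi / s for wi in w]
        means.append(sum(wi * p for wi, p in zip(w, parts)))
        # multinomial resampling to fight weight degeneracy
        parts = random.choices(parts, weights=w, k=n_particles)
    return means

xs, ys = simulate(30)
est = sir_filter(ys)
rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(est, xs)) / len(xs))
```

The per-step weighting over many particles is exactly the cost the paper attacks: its filters replace parts of this loop with deterministic sparse-grid quadrature.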

3.
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life‐cycle assessments and cost‐benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil‐fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high‐level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions.

4.
Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy‐making communities. It reminded the world that impacts from near‐Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low‐probability, high‐consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability—but not the consequences—of an impact with global effects (“cataclysm”). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk‐reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth.
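The point about distributions versus point estimates can be made concrete with a Monte Carlo sketch over a 100-year horizon. The annual impact probability and the log-uniform casualty distribution below are invented placeholders, not the article's calibrated inputs; the sketch only shows why a mean hides the tail of a low-probability, high-consequence risk.

```python
import random

random.seed(42)

def sample_casualties():
    """Total casualties over a hypothetical 100-year horizon."""
    total = 0
    for _ in range(100):
        # Assumed 1% chance per year of a damaging impact.
        if random.random() < 0.01:
            # Assumed log-uniform severity: 10**U(1, 6) casualties per impact.
            total += int(10 ** random.uniform(1, 6))
    return total

runs = [sample_casualties() for _ in range(20000)]
p_zero = sum(r == 0 for r in runs) / len(runs)       # chance of no casualties
mean = sum(runs) / len(runs)                         # the usual point estimate
q95 = sorted(runs)[int(0.95 * len(runs))]            # a tail risk metric
```

Under these assumptions most centuries see no casualties at all, yet the 95th percentile is several times the mean, which is the kind of information a point estimate cannot convey.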

5.
Low‐earth orbit (LEO) satellite systems continue to provide mobile communication services. The issue of cost containment in system maintenance is a critical factor for continued operation. Satellite lifetimes are finite and follow a stochastic process, and since satellite replenishment cost is the most significant on‐going cost of operation, finding optimal launch policies is of paramount importance. This paper formulates the satellite launch problem as a Markovian decision model that can be solved using dynamic programming. The policy space of the system is enormous and traditional action space dominance rules do not apply. In order to solve the dynamic program for realistic problem sizes, a novel procedure for limiting the state space considered in the dynamic program is developed. The viability of the proposed solution procedure is demonstrated in example problems using realistic system data. The policies derived by the proposed solution procedure are superior to those currently considered by LEO system operators, and result in substantial annual cost savings.
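The structure of such a Markovian launch model can be sketched with value iteration on a deliberately tiny state space. The state is the number of working satellites, each fails independently per period, and the action is how many replacements to launch; the failure probability, launch cost, and shortage penalty are all assumed numbers, and the state-space-limiting procedure the paper develops is not reproduced here.

```python
from math import comb

# Toy replenishment MDP. All parameters are assumed for illustration.
P_FAIL, CAP, GAMMA = 0.2, 3, 0.95       # per-period failure prob, max fleet, discount
LAUNCH_COST, SHORTAGE_PENALTY = 5.0, 40.0
SERVICE_MIN = 2                          # satellites needed to maintain service

def survivor_dist(n):
    """Binomial distribution of satellites surviving one period."""
    return {k: comb(n, k) * (1 - P_FAIL) ** k * P_FAIL ** (n - k)
            for k in range(n + 1)}

def value_iteration(iters=400):
    V = [0.0] * (CAP + 1)                # expected discounted cost-to-go
    policy = [0] * (CAP + 1)
    for _ in range(iters):
        newV = [0.0] * (CAP + 1)
        for s in range(CAP + 1):
            best, best_a = float("inf"), 0
            for a in range(CAP - s + 1):          # launches this period
                n = s + a
                cost = LAUNCH_COST * a
                if n < SERVICE_MIN:
                    cost += SHORTAGE_PENALTY
                cost += GAMMA * sum(p * V[k] for k, p in survivor_dist(n).items())
                if cost < best:
                    best, best_a = cost, a
            newV[s], policy[s] = best, best_a
        V = newV
    return V, policy

V, policy = value_iteration()
```

With four states this solves instantly; the paper's contribution is precisely making such dynamic programs solvable when the fleet and policy spaces are realistically large.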

6.
Floods are a natural hazard evolving in space and time according to meteorological and river basin dynamics, so that a single flood event can affect different regions over the event duration. This physical mechanism introduces spatio‐temporal relationships between flood records and losses at different locations over a given time window that should be taken into account for an effective assessment of the collective flood risk. However, since extreme floods are rare events, the limited number of historical records usually prevents a reliable frequency analysis. To overcome this limit, we move from the analysis of extreme events to the modeling of continuous stream flow records preserving spatio‐temporal correlation structures of the entire process, and making more efficient use of the information provided by continuous flow records. The approach is based on the dynamic copula framework, which allows for splitting the modeling of spatio‐temporal properties by coupling suitable time series models accounting for temporal dynamics, and multivariate distributions describing spatial dependence. The model is applied to 490 stream flow sequences recorded across 10 of the largest river basins in central and eastern Europe (Danube, Rhine, Elbe, Oder, Weser, Meuse, Rhone, Seine, Loire, and Garonne). Using available proxy data to quantify local flood exposure and vulnerability, we show that the temporal dependence exerts a key role in reproducing interannual persistence, and thus magnitude and frequency of annual proxy flood losses aggregated at a basin‐wide scale, while copulas allow the preservation of the spatial dependence of losses at weekly and annual time scales.
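The copula idea, separating marginal flow distributions from their spatial dependence, can be sketched with a Gaussian copula linking two gauge sites. The correlation value and the exponential margins are illustrative stand-ins (the article couples fitted time-series models with multivariate dependence structures, not this static toy).

```python
import math
import random

random.seed(1)

RHO = 0.8  # assumed spatial correlation between two gauge sites

def norm_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def sample_pair():
    # Correlated standard normals -> Gaussian copula -> chosen margins.
    z1 = random.gauss(0, 1)
    z2 = RHO * z1 + math.sqrt(1 - RHO ** 2) * random.gauss(0, 1)
    u1, u2 = norm_cdf(z1), norm_cdf(z2)
    # Exponential(1) marginal flows at each site (illustrative choice).
    return -math.log(1 - u1), -math.log(1 - u2)

pairs = [sample_pair() for _ in range(20000)]
# Empirical probability that both sites exceed their 90th-percentile flow.
t1 = sorted(x for x, _ in pairs)[int(0.9 * len(pairs))]
t2 = sorted(y for _, y in pairs)[int(0.9 * len(pairs))]
joint = sum(x > t1 and y > t2 for x, y in pairs) / len(pairs)
```

Under independence the joint exceedance would be about 0.01; the copula raises it severalfold, which is exactly the effect that matters when aggregating flood losses across a basin.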

7.
Simulation is a powerful tool for modeling complex systems with intricate relationships between various entities and resources. Simulation optimization refers to methods that search the design space (i.e., the set of all feasible system configurations) to find a system configuration (also called a design point) that gives the best performance. Since simulation is often time consuming, sampling as few design points from the design space as possible is desired. However, in the case of multiple objectives, traditional simulation optimization methods are ineffective at uncovering the efficient frontier. We propose a framework for multi-objective simulation optimization that combines the power of genetic algorithm (GA), which can effectively search very large design spaces, with data envelopment analysis (DEA) used to evaluate the simulation results and guide the search process. In our framework, we use a design point's relative efficiency score from DEA as its fitness value in the selection operation of GA. We apply our algorithm to determine optimal resource levels in surgical services. Our numerical experiments show that our algorithm effectively advances the efficient frontier and identifies efficient design points.

8.
In this article, we propose an integrated direct and indirect flood risk model for small‐ and large‐scale flood events, allowing for dynamic modeling of total economic losses from a flood event to a full economic recovery. A novel approach is taken that translates direct losses of both capital and labor into production losses using the Cobb‐Douglas production function, aiming at improved consistency in loss accounting. The recovery of the economy is modeled using a hybrid input‐output model and applied to the port region of Rotterdam, using six different flood events (1/10 up to 1/10,000). This procedure provides better insight into the consequences of both high‐ and low‐probability floods. The results show that in terms of expected annual damage, direct losses remain more substantial relative to the indirect losses (approximately 50% larger), but for low‐probability events the indirect losses outweigh the direct losses. Furthermore, we explored parameter uncertainty using a global sensitivity analysis, and varied critical assumptions in the modeling framework related to, among others, flood duration and labor recovery, using a scenario approach. Our findings have two important implications for disaster modelers and practitioners. First, high‐probability events are qualitatively different from low‐probability events in terms of the scale of damages and full recovery period. Second, there are substantial differences in parameter influence between high‐probability and low‐probability flood modeling. These findings suggest that a detailed approach is required when assessing the flood risk for a specific region.
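The Cobb‐Douglas translation step can be shown in a few lines: direct losses shrink the capital and labor inputs, and the production function converts that into an output (production) loss. The capital share, productivity constant, and damage fractions below are hypothetical, not the Rotterdam calibration.

```python
# Cobb-Douglas production: Y = A * K**alpha * L**(1 - alpha)
ALPHA = 0.33   # capital share (assumed)
A = 1.0        # total factor productivity (normalized)

def output(K, L):
    return A * K ** ALPHA * L ** (1 - ALPHA)

K0, L0 = 100.0, 100.0
y0 = output(K0, L0)                  # pre-flood output
# Hypothetical flood destroying 20% of capital and idling 10% of labor:
y1 = output(0.8 * K0, 0.9 * L0)
production_loss = 1 - y1 / y0        # fractional drop in production
```

Note the nonlinearity: a 20% capital loss plus a 10% labor loss produces roughly a 13% production loss here, not 30%, because the two inputs partially substitute for each other.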

9.
This article introduces a new integrated scenario-based evacuation (ISE) framework to support hurricane evacuation decision making. It explicitly captures the dynamics, uncertainty, and human–natural system interactions that are fundamental to the challenge of hurricane evacuation, but have not been fully captured in previous formal evacuation models. The hazard is represented with an ensemble of probabilistic scenarios, population behavior with a dynamic decision model, and traffic with a dynamic user equilibrium model. The components are integrated in a multistage stochastic programming model that minimizes risk and travel times to provide a tree of evacuation order recommendations and an evaluation of the risk and travel time performance for that solution. The ISE framework recommendations offer an advance in the state of the art because they: (1) are based on an integrated hazard assessment (designed to ultimately include inland flooding), (2) explicitly balance the sometimes competing objectives of minimizing risk and minimizing travel time, (3) offer a well-hedged solution that is robust under the range of ways the hurricane might evolve, and (4) leverage the substantial value of increasing information (or decreasing degree of uncertainty) over the course of a hurricane event. A case study for Hurricane Isabel (2003) in eastern North Carolina is presented to demonstrate how the framework is applied, the type of results it can provide, and how it compares to available methods of a single scenario deterministic analysis and a two-stage stochastic program.

10.
In this study, we propose a methodological framework to provide a road map to clinicians and system planners in developing chronic disease management strategies, and designing community‐based care. We extend the analytical epidemiologic model by utilizing a patient flow approach, in order to model the multiple care‐provider visit patterns of patients with a specific chronic illness. The patterns of care received by a group of patients are represented in compact form by means of a Markov model that is based on a disease‐specific state space. Our framework also reflects the case‐mix biases as well as the care‐provider level clustering of the patients. By using this approach, we identify the patterns of care, determine the care provider and patient characteristics associated with optimal management of care, and estimate the potential influence of various interventions. The framework is applied to data on more than 4,000 stroke patients discharged from the acute care hospitals of Quebec to their homes. Our findings provide a basis for designing community‐based care initiatives for stroke survivors in the province.
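The compact Markov representation of visit patterns amounts to a row-stochastic transition matrix over disease-specific care states; iterating it gives the long-run distribution of where patients are seen. The three states and every transition probability below are invented for illustration, not the Quebec stroke data.

```python
# Hypothetical care states for a chronic-disease cohort.
STATES = ["home", "clinic", "hospital"]

# Row-stochastic transition matrix between visit states (assumed values).
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
]

def step(dist, P):
    """One Markov step: new_j = sum_i dist_i * P[i][j]."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0]       # cohort starts at home
for _ in range(200):         # iterate to the long-run visit pattern
    dist = step(dist, P)
```

The long-run distribution (here 19/41, 13/41, 9/41) summarizes a cohort's care pattern in one vector, and the effect of an intervention can be estimated by perturbing the transition probabilities and re-iterating.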

11.
This article is based on a quantitative risk assessment (QRA) that was performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in‐place management of the disposal area for another decade. The QRA methodology adopted for this first of a kind application was a scenario‐based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates taking corrective actions and implementing effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.
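The triplet definition lends itself to a very small computational sketch: each scenario carries a frequency and a consequence, the scenarios assemble into a dose-exceedance curve, and ranking by frequency-times-dose flags the dominant contributors. The scenario names, frequencies, and doses below are notional, not the Western New York results.

```python
# Risk triplets: (scenario, frequency per year, dose in mSv) - notional values.
SCENARIOS = [
    ("erosion_release", 1e-3, 50.0),
    ("flood_breach",    1e-4, 800.0),
    ("intrusion",       1e-5, 2000.0),
]

def exceedance_frequency(dose_level):
    """Annual frequency of receiving a dose >= dose_level
    (one point on the assembled risk curve)."""
    return sum(f for _, f, d in SCENARIOS if d >= dose_level)

def ranked_by_expected_dose():
    """Scenario importance ranking by frequency x consequence."""
    return sorted(SCENARIOS, key=lambda s: s[1] * s[2], reverse=True)
```

With these notional numbers the rare-but-severe flood scenario dominates the frequent-but-mild erosion scenario, which is the kind of ranking that directs corrective action.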

12.
In the event of contamination of a water distribution system, decisions must be made to mitigate the impact of the contamination and to protect public health. Making threat management decisions while a contaminant spreads through the network is a dynamic and interactive process. Response actions taken by the utility managers and water consumption choices made by the consumers will affect the hydraulics, and thus the spread of the contaminant plume, in the network. A modeling framework that allows the simulation of a contamination event under the effects of actions taken by utility managers and consumers will be a useful tool for the analysis of alternative threat mitigation and management strategies. This article presents a multiagent modeling framework that combines agent‐based, mechanistic, and dynamic methods. Agents select actions based on a set of rules that represent an individual's autonomy, goal‐based desires, and reaction to the environment and the actions of other agents. Consumer behaviors including ingestion, mobility, reduction of water demands, and word‐of‐mouth communication are simulated. Management strategies, including opening hydrants to flush the contaminant and broadcasting warnings to consumers, are evaluated. As actions taken by consumer agents and utility operators affect demands and flows in the system, the mechanistic model is updated. Management strategies are evaluated based on the exposure of the population to the contaminant. The framework is designed to consider the typical issues involved in water distribution threat management and provides valuable analysis of threat containment strategies for water distribution system contamination events.

13.
Risk Analysis, 2018, 38(6): 1279–1305
Modern infrastructures are becoming increasingly dependent on electronic systems, leaving them more vulnerable to electrical surges or electromagnetic interference. Electromagnetic disturbances appear in nature, e.g., lightning and solar wind; however, they may also be generated by man‐made technology to maliciously damage or disturb electronic equipment. This article presents a systematic risk assessment framework for identifying possible, consequential, and plausible intentional electromagnetic interference (IEMI) attacks on an arbitrary distribution network infrastructure. In the absence of available data on IEMI occurrences, we find that a systems‐based risk assessment is more useful than a probabilistic approach. We therefore modify the often applied definition of risk, i.e., a set of triplets containing scenario, probability, and consequence, to a set of quadruplets: scenario, resource requirements, plausibility, and consequence. Probability is “replaced” by resource requirements and plausibility, where the former is the minimum amount and type of equipment necessary to successfully carry out an attack scenario and the latter is a subjective assessment of the extent of the existence of attackers who possess the motivation, knowledge, and resources necessary to carry out the scenario. We apply the concept of intrusion areas and classify electromagnetic source technology according to key attributes. Worst‐case scenarios are identified for different quantities of attacker resources. The most plausible and consequential of these are deemed the most important scenarios and should provide useful decision support in a countermeasures effort. Finally, an example of the proposed risk assessment framework, based on notional data, is provided on a hypothetical water distribution network.
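The quadruplet definition maps naturally onto a small data structure: replace the probability field with a resource-requirement class and a subjective plausibility, then screen scenarios by the resources an attacker class can muster. The scenario names, resource classes, and numbers below are invented placeholders in the spirit of the article's notional data.

```python
from dataclasses import dataclass

@dataclass
class IEMIScenario:
    name: str
    resources: int        # minimum attacker resource class (1 = COTS .. 5 = state-level)
    plausibility: float   # subjective assessment in [0, 1]
    consequence: float    # e.g., expected outage duration in hours

# Notional quadruplets for a hypothetical water distribution network.
SCENARIOS = [
    IEMIScenario("parking_lot_jammer",      1, 0.60, 2.0),
    IEMIScenario("vehicle_hpem_source",     3, 0.20, 40.0),
    IEMIScenario("custom_microwave_weapon", 5, 0.05, 200.0),
]

def important(scenarios, max_resources):
    """Scenarios feasible within a resource class, ranked by
    plausibility x consequence (the 'most important' screen)."""
    feasible = [s for s in scenarios if s.resources <= max_resources]
    return sorted(feasible, key=lambda s: s.plausibility * s.consequence,
                  reverse=True)
```

Screening at different `max_resources` levels reproduces the article's idea of identifying worst cases per quantity of attacker resources without ever assigning a frequency.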

14.
This paper provides a transdisciplinary critical review of the literature on maternity management in small and medium‐sized enterprises (SMEs), embedded within the wider literatures on maternity in the workplace. The key objectives are to describe what is known about the relations that shape maternity management in smaller workplaces and to identify research directions to enhance this knowledge. The review is guided by theory of organizational gendering and small business management, conceptualizing adaptations to maternity as a process of mutual adjustment and dynamic capability within smaller firms’ informally negotiated order, resource endowments and wider labour and product/service markets. A context‐sensitive lens is also applied. The review highlights the complex range of processes involved in SME maternity management and identifies major research gaps in relation to pregnancy, maternity leave and the return to work (family‐friendly working and breastfeeding) in these contexts. This blind spot is surprising, as SMEs employ the majority of women worldwide. A detailed agenda for future research is outlined, building on the gaps identified by the review and founded on renewed theoretical direction.

15.
Current factory design and evaluation practices are very primitive. Factory components are designed in many cases independently. Product and process design are not well integrated. An encompassing framework is needed for iterating through a series of total factory designs, searching for optimal performance. In addition, a vehicle is needed for predicting the performance of a proposed advanced manufacturing system, so that engineers may have a sound means for evaluating such proposals. A heterarchical discrete manufacturing SIMNET II simulation model (SIMCELLS) was developed as a comprehensive methodology for designing and evaluating discrete manufacturing systems. SIMCELLS allows manufacturing systems engineers to experiment with alternative system structures and control strategies while seeking that combination of design features that will produce the desired overall system performance. The model in combination with a modernization programme is enabling a firm to successfully manufacture and sell trucks meeting international standards. The SIMNET II model …

16.
Managers must regularly make decisions on how to access and deploy their limited resources in order to build organizational capabilities for a sustainable competitive advantage. However, failure to recognize that organizational capabilities involve complex and intricately woven underlying processes may lead to an incomplete understanding of how capabilities affect competitive advantage. As a means of understanding this underlying complexity, we discuss how managerial decisions on resource acquisition and deployment influence capability embeddedness and argue that capability embeddedness has an incremental effect on firm performance beyond the effects from organizational resources and capabilities. To investigate these issues, we present a hierarchical composed error structure framework that relies on cross‐sectional data (and allows for generalizations to panel data). We demonstrate the framework in the context of retailing, where we show that the embeddedness of organizational capabilities influences retailer performance above and beyond the tangible and intangible resources and capabilities that a retailer possesses. Our results illustrate that understanding how resources and capabilities influence performance at different hierarchical levels within a firm can aid managers to make better decisions on how they can embed certain capabilities within the structural and social relationships within the firm. Moreover, understanding whether the underlying objectives of the capabilities that are being built and cultivated have convergent or divergent goals is critical, as it can influence the extent to which the embedded capabilities enhance firm performance.

17.
Outbreaks of contagious diseases underscore the ever‐looming threat of new epidemics. Compared to other disasters that inflict physical damage to infrastructure systems, epidemics can have more devastating and prolonged impacts on the population. This article investigates the interdependent economic and productivity risks resulting from epidemic‐induced workforce absenteeism. In particular, we develop a dynamic input‐output model capable of generating sector‐disaggregated economic losses based on different magnitudes of workforce disruptions. An ex post analysis of the 2009 H1N1 pandemic in the national capital region (NCR) reveals the distribution of consequences across different economic sectors. Consequences are categorized into two metrics: (i) economic loss, which measures the magnitude of monetary losses incurred in each sector, and (ii) inoperability, which measures the normalized monetary losses incurred in each sector relative to the total economic output of that sector. For a simulated mild pandemic scenario in NCR, two distinct rankings are generated using the economic loss and inoperability metrics. Results indicate that the majority of the critical sectors ranked according to the economic loss metric consist of sectors that contribute the most to the NCR's gross domestic product (e.g., federal government enterprises). In contrast, the majority of the critical sectors generated by the inoperability metric include sectors that are involved with epidemic management (e.g., hospitals). Hence, prioritizing sectors for recovery necessitates consideration of the balance between economic loss, inoperability, and other objectives. Although applied specifically to the NCR, the proposed methodology can be customized for other regions.
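The divergence between the two metrics can be reproduced with a two-sector dynamic inoperability sketch. The interdependency matrix, resilience coefficients, sector outputs, and initial workforce shock below are all invented, and the demand-side perturbation term of the full model is omitted; the point is only that a small high-inoperability sector and a large low-inoperability sector rank differently under the two metrics.

```python
# Two hypothetical sectors: 0 = hospitals (small output, hard hit),
# 1 = government enterprises (large output, lightly hit).
A_STAR = [[0.1, 0.2],
          [0.3, 0.1]]            # normalized interdependency matrix (assumed)
K = [0.5, 0.4]                   # sector resilience coefficients (assumed)
OUTPUT = [100.0, 400.0]          # sector output per period, $M (assumed)

def diim_path(q0, steps):
    """Simplified recurrence q(t+1) = q(t) + K [A* q(t) - q(t)];
    returns the inoperability trajectory of each sector."""
    q, path = list(q0), [list(q0)]
    for _ in range(steps):
        q = [q[i] + K[i] * (sum(A_STAR[i][j] * q[j] for j in range(2)) - q[i])
             for i in range(2)]
        path.append(list(q))
    return path

path = diim_path([0.5, 0.1], 60)                    # epidemic hits hospitals hardest
inop = [sum(q[i] for q in path) for i in range(2)]  # cumulative inoperability
econ = [OUTPUT[i] * inop[i] for i in range(2)]      # cumulative monetary loss
```

Here the hospital sector dominates the inoperability ranking while the larger sector dominates the economic-loss ranking, mirroring the article's finding that the two metrics prioritize different sectors.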

18.
Joost R. Santos, Risk Analysis, 2012, 32(10): 1673–1692
Disruptions in the production of commodities and services resulting from disasters influence the vital functions of infrastructure and economic sectors within a region. The interdependencies inherent among these sectors trigger the faster propagation of disaster consequences that are often associated with a wider range of inoperability and amplified losses. This article evaluates the impact of inventory‐enhanced policies for disrupted interdependent sectors to improve the disaster preparedness capability of dynamic inoperability input‐output models (DIIM). In this article, we develop the dynamic cross‐prioritization plot (DCPP)—a prioritization methodology capable of identifying and dynamically updating the critical sectors based on preference assignments to different objectives. The DCPP integrates the risk assessment metrics (e.g., economic loss and inoperability), which are independently analyzed in the DIIM. We develop a computer‐based DCPP tool to determine the priority for inventory enhancement with user preference and resource availability as new dimensions. A baseline inventory case for the state of Virginia revealed a high concentration of (i) manufacturing sectors under the inoperability objective and (ii) service sectors under the economic loss objective. Simulation of enhanced inventory policies for selected critical manufacturing sectors has reduced the recovery period by approximately four days and the expected total economic loss by $33 million. Although the article focuses on enhancing inventory levels in manufacturing sectors, complementary analysis is recommended to manage the resilience of the service sectors. The flexibility of the proposed DCPP as a decision support tool can also be extended to accommodate analysis in other regions and disaster scenarios.

19.
Space weather phenomena have been studied in detail in the peer‐reviewed scientific literature. However, there has arguably been scant analysis of the potential socioeconomic impacts of space weather, despite a growing gray literature from different national studies, of varying degrees of methodological rigor. In this analysis, we therefore provide a general framework for assessing the potential socioeconomic impacts of critical infrastructure failure resulting from geomagnetic disturbances, applying it to the British high‐voltage electricity transmission network. Socioeconomic analysis of this threat has hitherto failed to address the general geophysical risk, asset vulnerability, and the network structure of critical infrastructure systems. We overcome this by using a three‐part method that includes (i) estimating the probability of intense magnetospheric substorms, (ii) exploring the vulnerability of electricity transmission assets to geomagnetically induced currents, and (iii) testing the socioeconomic impacts under different levels of space weather forecasting. This has required a multidisciplinary approach, providing a step toward the standardization of space weather risk assessment. We find that for a Carrington‐sized 1‐in‐100‐year event with no space weather forecasting capability, the gross domestic product loss to the United Kingdom could be as high as £15.9 billion, with this figure dropping to £2.9 billion based on current forecasting capability. However, with existing satellites nearing the end of their life, current forecasting capability will decrease in coming years. Therefore, if no further investment takes place, critical infrastructure will become more vulnerable to space weather. Additional investment could provide enhanced forecasting, reducing the economic loss for a Carrington‐sized 1‐in‐100‐year event to £0.9 billion.
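The headline figures translate directly into expected annual losses once the 1-in-100-year frequency is applied, which is how the value of forecasting investment can be framed. The arithmetic below uses only the abstract's own numbers; treating the event as exactly annual-probability 1/100 is a simplifying assumption.

```python
# GDP-loss figures (billions of pounds) from the abstract, per forecasting level.
P_EVENT = 1 / 100          # 1-in-100-year Carrington-sized event
LOSS = {
    "no_forecasting": 15.9,
    "current": 2.9,
    "enhanced": 0.9,
}

expected_annual_loss = {k: P_EVENT * v for k, v in LOSS.items()}
# Expected annual benefit of enhanced forecasting over current capability:
benefit = expected_annual_loss["current"] - expected_annual_loss["enhanced"]
```

Under this simple framing, enhanced forecasting is worth about £20 million per year in avoided expected loss, a figure that could be weighed against the cost of replacing the aging monitoring satellites.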

20.
Owing to its inherent modeling flexibility, simulation is often regarded as the proper means for supporting decision making on supply chain design. The ultimate success of supply chain simulation, however, is determined by a combination of the analyst's skills, the chain members' involvement, and the modeling capabilities of the simulation tool. This combination should provide the basis for a realistic simulation model, which is both transparent and complete. The need for transparency is especially strong for supply chains as they involve (semi)autonomous parties each having their own objectives. Mutual trust and model effectiveness are strongly influenced by the degree of completeness of each party's insight into the key decision variables. Ideally, visual interactive simulation models present an important communicative means for realizing the required overview and insight. Unfortunately, most models strongly focus on physical transactions, leaving key decision variables implicit for some or all of the parties involved. This especially applies to control structures, that is, the managers or systems responsible for control, their activities, and the mutual coordination of those activities. Control elements are, for example, dispersed over the model, are not visualized, or form part of the time‐indexed scheduling of events. In this article, we propose an alternative approach that explicitly addresses the modeling of control structures. First, we conduct a literature survey with the aim of listing simulation model qualities essential for supporting successful decision making on supply chain design. Next, we use this insight to define an object‐oriented modeling framework that facilitates supply chain simulation in a more realistic manner. This framework is meant to contribute to improved decision making in terms of recognizing and understanding opportunities for improved supply chain design. Finally, the use of the framework is illustrated by a case example concerning a supply chain for chilled salads.
