Similar Articles
1.
In August 2012, Hurricane Isaac, a Category 1 hurricane at landfall, caused extensive power outages in Louisiana. The storm brought high winds, storm surge, and flooding to Louisiana, and power outages were widespread and prolonged. Hourly power outage data for the state of Louisiana were collected during the storm and analyzed. This analysis included correlation of hourly power outage figures by zip code with storm conditions including wind, rainfall, and storm surge using a nonparametric ensemble data mining approach. Results were analyzed to understand how correlation of power outages with storm conditions differed geographically within the state. This analysis provided insight into how rainfall and storm surge, along with wind, contribute to power outages in hurricanes. By conducting a longitudinal study of outages at the zip code level, we were able to gain insight into the causal drivers of power outages during hurricanes. Our analysis showed that the statistical importance of storm characteristic covariates to power outages varies geographically. For Hurricane Isaac, wind speed, precipitation, and previous outages generally had high importance, whereas storm surge had lower importance, even in zip codes that experienced significant surge. The results of this analysis can inform the development of power outage forecasting models, which often focus strictly on wind‐related covariates. Our study of Hurricane Isaac indicates that inclusion of other covariates, particularly precipitation, may improve model accuracy and robustness across a range of storm conditions and geography.
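The covariate-importance analysis this abstract describes can be sketched with a random-forest ensemble and permutation importance. The data below are synthetic and the covariate set (wind, rain, surge, previous outages) is only loosely modeled on the study's; none of the coefficients come from the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 500
# synthetic zip-code-level covariates (illustrative only)
wind = rng.uniform(10, 40, n)       # wind speed, m/s
rain = rng.uniform(0, 300, n)       # rainfall, mm
surge = rng.uniform(0, 3, n)        # storm surge, m
prev_out = rng.poisson(5, n)        # outages in previous period
# outages driven mostly by wind, weakly by surge (assumed relationship)
outages = 3 * wind + 0.05 * rain + 0.5 * surge + 2 * prev_out + rng.normal(0, 5, n)

X = np.column_stack([wind, rain, surge, prev_out])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, outages)
imp = permutation_importance(rf, X, outages, n_repeats=10, random_state=0)
for name, m in zip(["wind", "rain", "surge", "prev_out"], imp.importances_mean):
    print(f"{name}: {m:.3f}")
```

On data generated this way, wind dominates the importance ranking and surge contributes little, mirroring the qualitative pattern the abstract reports.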

2.
Hurricanes frequently cause damage to electric power systems in the United States, leading to widespread and prolonged loss of electric service. Restoring service quickly requires the use of repair crews and materials that must be requested, at considerable cost, prior to the storm. U.S. utilities have struggled to strike a good balance between over‐ and underpreparation largely because of a lack of methods for rigorously estimating the impacts of an approaching hurricane on their systems. Previous work developed methods for estimating the risk of power outages and customer loss of power, with an outage defined as nontransitory activation of a protective device. In this article, we move beyond these previous approaches to directly estimate damage to the electric power system. Our approach is based on damage data from past storms together with regression and data mining techniques to estimate the number of utility poles that will need to be replaced. Because restoration times and resource needs are more closely tied to the number of poles and transformers that need to be replaced than to the number of outages, this pole‐based assessment provides a much stronger basis for prestorm planning by utilities. Our results show that damage to poles during hurricanes can be assessed accurately, provided that adequate past damage data are available. However, the availability of data can be, and currently often is, the limiting factor in developing these types of models in practice. Opportunities for further enhancing the damage data recorded during hurricanes are also discussed.

3.
Electric power is a critical infrastructure service after hurricanes, and rapid restoration of electric power is important in order to minimize losses in the impacted areas. However, rapid restoration of electric power after a hurricane depends on obtaining the necessary resources, primarily repair crews and materials, before the hurricane makes landfall and then appropriately deploying these resources as soon as possible after the hurricane. This, in turn, depends on having sound estimates of both the overall severity of the storm and the relative risk of power outages in different areas. Past studies have developed statistical, regression-based approaches for estimating the number of power outages in advance of an approaching hurricane. However, these approaches have either not been applicable for future events or have had lower predictive accuracy than desired. This article shows that a different type of regression model, a generalized additive model (GAM), can outperform the types of models used previously. This is done by developing and validating a GAM based on power outage data during past hurricanes in the Gulf Coast region and comparing the results from this model to the previously used generalized linear models.

4.
This article compares two nonparametric tree‐based models, quantile regression forests (QRF) and Bayesian additive regression trees (BART), for predicting storm outages on an electric distribution network in Connecticut, USA. We evaluated point estimates and prediction intervals of outage predictions for both models using high‐resolution weather, infrastructure, and land use data for 89 storm events (including hurricanes, blizzards, and thunderstorms). We found that BART produced spatially more accurate point estimates than QRF. However, QRF produced better prediction intervals at high spatial resolutions (2‐km grid cells and towns), while BART predictions aggregated to coarser resolutions (divisions and service territory) more effectively. We also found that the predictive accuracy was dependent on the season (e.g., tree‐leaf condition, storm characteristics), and that the predictions were most accurate for winter storms. Given the merits of each individual model, we suggest that BART and QRF be implemented together to show the complete picture of a storm's potential impact on the electric distribution network, which would allow a utility to make better decisions about allocating prestorm resources.
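Scikit-learn ships neither QRF nor BART, but the prediction-interval idea can be sketched with quantile-loss gradient boosting as a stand-in: fit one model per quantile and read the interval off the low/high predictions (synthetic data; the 10%/90% bounds play the role of the models' prediction intervals):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)
X = rng.uniform(0, 10, (600, 1))
y = X.ravel() + rng.normal(0, 1, 600)

# one model per target quantile: lower bound, median, upper bound
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q,
                                       n_estimators=200).fit(X, y)
          for q in (0.1, 0.5, 0.9)}
lo = models[0.1].predict(X)
hi = models[0.9].predict(X)
coverage = np.mean((y >= lo) & (y <= hi))
print(f"empirical coverage of the 10-90% interval: {coverage:.2f}")
```

Empirical coverage near the nominal 80% is the basic sanity check the paper applies (at several spatial resolutions) to the QRF and BART intervals.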

5.
In the delivery of health care services, variability in the patient arrival and service processes can cause excessive patient waiting times and poor utilization of facility resources. Based on data collected at a large primary care facility, this paper investigates how several sources of variability affect facility performance. These sources include ancillary tasks performed by the physician, patient punctuality, unscheduled visits to the facility's laboratory or X‐ray services, momentary interruptions of a patient's examination, and examination time variation by patient class. Our results indicate that unscheduled visits to the facility's laboratory or X‐ray services have the largest impact on a physician's idle time. The average patient wait is most affected by how the physician prioritizes completing ancillary tasks, such as telephone calls, relative to examining patients. We also investigate the improvement in system performance offered by using increasing levels of patient information when creating the appointment schedule. We find that the use of policies that sequence patients based on their classification improves system performance by up to 25.5%.  相似文献   

6.
Incident data about disruptions to the electric power grid provide useful information that can be used as inputs into risk management policies in the energy sector for disruptions from a variety of origins, including terrorist attacks. This article uses data from the Disturbance Analysis Working Group (DAWG) database, which is maintained by the North American Electric Reliability Council (NERC), to look at incidents over time in the United States and Canada for the period 1990-2004. Negative binomial regression, logistic regression, and weighted least squares regression are used to gain a better understanding of how these disturbances varied over time and by season during this period, and to analyze how characteristics such as number of customers lost and outage duration are related to different characteristics of the outages. The results of the models can be used as inputs to construct various scenarios to estimate potential outcomes of electric power outages, encompassing the risks, consequences, and costs of such outages.
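Negative binomial regression is the natural choice here because incident counts are typically overdispersed (variance exceeding the mean), which a Poisson model cannot represent. A small numpy sketch of that diagnostic on simulated counts drawn as a gamma-Poisson mixture, which is exactly a negative binomial (synthetic data, not the DAWG records):

```python
import numpy as np

rng = np.random.default_rng(3)
# Poisson rate varies from period to period -> gamma-Poisson mixture,
# i.e. negative binomial counts (all parameters illustrative)
rates = rng.gamma(shape=2.0, scale=5.0, size=2000)   # mean rate 10
counts = rng.poisson(rates)

mean, var = counts.mean(), counts.var()
print(f"mean={mean:.2f}, variance={var:.2f}")
# method-of-moments dispersion for NB: var = mean + alpha * mean^2
alpha = (var - mean) / mean**2
print(f"estimated dispersion alpha={alpha:.2f}")
```

For a true Poisson process the variance would match the mean and alpha would be near zero; a clearly positive alpha is the signal that a negative binomial model is needed.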

7.
The U.S. federal government regulates the reliability of bulk power systems, while the reliability of power distribution systems is regulated at a state level. In this article, we review the history of regulating electric service reliability and study the existing reliability metrics, indices, and standards for power transmission and distribution networks. We assess the foundations of the reliability standards and metrics, discuss how they are applied to outages caused by large exogenous disturbances such as natural disasters, and investigate whether the standards adequately internalize the impacts of these events. Our reflections shed light on how existing standards conceptualize reliability, question the basis for treating large‐scale hazard‐induced outages differently from normal daily outages, and discuss whether this conceptualization maps well onto customer expectations. We show that the risk indices for transmission systems used in regulating power system reliability do not adequately capture the risks that transmission systems are prone to, particularly when it comes to low‐probability high‐impact events. We also point out several shortcomings associated with the way in which regulators require utilities to calculate and report distribution system reliability indices. We offer several recommendations for improving the conceptualization of reliability metrics and standards. We conclude that while the approaches taken in reliability standards have made considerable advances in enhancing the reliability of power systems and may be logical from a utility perspective during normal operation, existing standards do not provide a sufficient incentive structure for the utilities to adequately ensure high levels of reliability for end‐users, particularly during large‐scale events.

8.
In this article, we discuss an outage‐forecasting model that we have developed. This model uses very few input variables to estimate hurricane‐induced outages prior to landfall with great predictive accuracy. We also show the results for a series of simpler models that use only publicly available data and can still estimate outages with reasonable accuracy. The intended users of these models are emergency response planners within power utilities and related government agencies. We developed our models based on the method of random forest, using data from a power distribution system serving two states in the Gulf Coast region of the United States. We also show that estimates of system reliability based on wind speed alone are not sufficient for adequately capturing the reliability of system components. We demonstrate that a multivariate approach can produce more accurate power outage predictions.
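The multivariate-versus-wind-only comparison can be sketched with random forests on synthetic data; the extra covariates here (soil moisture, tree density) are illustrative stand-ins for the kinds of variables such models add, and the data-generating equation is assumed, not taken from the paper:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 800
wind = rng.uniform(15, 50, n)
soil = rng.uniform(0, 1, n)    # soil moisture (illustrative covariate)
trees = rng.uniform(0, 1, n)   # tree density (illustrative covariate)
# outages depend on wind plus a soil-tree interaction (assumed)
outages = 2 * wind + 40 * soil * trees + rng.normal(0, 3, n)

X_all = np.column_stack([wind, soil, trees])
Xa_tr, Xa_te, y_tr, y_te = train_test_split(X_all, outages, random_state=0)

multi = RandomForestRegressor(n_estimators=200, random_state=0).fit(Xa_tr, y_tr)
wind_only = RandomForestRegressor(n_estimators=200, random_state=0).fit(
    Xa_tr[:, :1], y_tr)

print("wind-only R^2:", round(wind_only.score(Xa_te[:, :1], y_te), 3))
print("multivariate R^2:", round(multi.score(Xa_te, y_te), 3))
```

Whenever outages have drivers beyond wind, the wind-only forest leaves that variation unexplained on held-out data, which is the abstract's point about wind-speed-only reliability estimates.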

9.
This paper reports results from a survey designed to: (1) evaluate changes in industrial pollution prevention practices since the passage of the landmark environmental legislation, the Emergency Planning and Community Right-to-Know Act of 1986, also known as SARA Title III, and (2) identify those factors that may contribute to an industrial facility engaging in pollution prevention and risk communication activities. The survey was conducted under a Cooperative Agreement between the U.S. Environmental Protection Agency and the Columbia University Center for Risk Communication. Evidence from the survey indicates that a wide variety of waste and pollution reduction activities have been undertaken since passage of the Act. Virtually all facilities surveyed in the pulp and paper, chemical, and petroleum and refining industries reported that they had reduced pollutants or wastes on at least one of ten measures, including reducing toxic air emissions. Most facilities indicated paying more attention to pollution prevention activities as a result of SARA Title III, and half reported that their communication activities have also increased.

10.
Narayanan A, Morgan MG. Risk Analysis. 2012;32(7):1183-1193.
Despite continuing efforts to make the electric power system robust, some risk remains of widespread and extended power outages due to extreme weather or acts of terrorism. One way to alleviate the most serious effects of a prolonged blackout is to find local means to secure the continued provision of critical social services upon which the health and safety of society depend. This article outlines and estimates the incremental cost of a strategy that uses small distributed generation, distribution automation, and smart meters to keep a set of critical social services operational during a prolonged power outage that lasts for days or weeks and extends over hundreds of kilometers.

11.
Grassmann WK. Omega. 1980;8(1):105-112.
This paper lists numerical results for a system with two parallel queues in which arriving customers always join the shorter queue. These results enable us to discuss how fast the expected queue length converges toward its equilibrium, provided the arrivals are below the capacity of the service facility. They also show how fast the expected queue length increases in case the traffic flow is at or above the capacity of the service facility. Finally, it is shown that the system under consideration behaves very similarly to a two-server queue.
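A join-the-shortest-queue system like the one studied here is straightforward to simulate. The sketch below estimates the mean number in system under JSQ with two exponential servers and compares it with the exact M/M/2 value at the same load; the parameters are illustrative, not Grassmann's:

```python
import random

def sim_jsq(lam, mu, horizon=100000, seed=0):
    """Two exponential servers, each with its own queue; arrivals join the shorter."""
    rng = random.Random(seed)
    t, q = 0.0, [0, 0]              # current time, jobs at each server
    area = 0.0                      # time-integral of number in system
    next_arr = rng.expovariate(lam)
    next_dep = [float("inf"), float("inf")]
    while t < horizon:
        t_next = min(next_arr, *next_dep)
        area += (q[0] + q[1]) * (t_next - t)
        t = t_next
        if t == next_arr:           # arrival: join the shorter queue
            i = 0 if q[0] <= q[1] else 1
            q[i] += 1
            if q[i] == 1:
                next_dep[i] = t + rng.expovariate(mu)
            next_arr = t + rng.expovariate(lam)
        else:                       # departure from the server due next
            i = 0 if next_dep[0] <= next_dep[1] else 1
            q[i] -= 1
            next_dep[i] = t + rng.expovariate(mu) if q[i] else float("inf")
    return area / t

L_jsq = sim_jsq(lam=1.5, mu=1.0)
# exact M/M/2 mean number in system at the same load, rho = lam / (2 mu)
rho = 0.75
p0 = 1 / (1 + 2 * rho + (2 * rho) ** 2 / (2 * (1 - rho)))
L_mm2 = 2 * rho + (2 * rho) ** 2 * rho / (2 * (1 - rho) ** 2) * p0
print(f"JSQ mean in system: {L_jsq:.2f};  M/M/2: {L_mm2:.2f}")
```

The simulated JSQ value comes out close to the M/M/2 figure, which is the paper's closing observation that the shorter-queue system behaves very similarly to a two-server queue.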

12.
We consider two capacity choice scenarios for the optimal location of facilities with fixed servers, stochastic demand, and congestion. Motivating applications include virtual call centers, consisting of geographically dispersed centers, walk‐in health clinics, motor vehicle inspection stations, automobile emissions testing stations, and internal service systems. The choice of locations for such facilities influences both the travel cost and waiting times of users. In contrast to most previous research, we explicitly embed both customer travel/connection and delay costs in the objective function and solve the location–allocation problem and choose facility capacities simultaneously. The choice of capacity for a facility that is viewed as a queueing system with Poisson arrivals and exponential service times could mean choosing a service rate for the servers (Scenario 1) or choosing the number of servers (Scenario 2). We express the optimal service rate in closed form in Scenario 1 and the (asymptotically) optimal number of servers in closed form in Scenario 2. This allows us to eliminate both the number of servers and the service rates from the optimization problems, leading to tractable mixed‐integer nonlinear programs. Our computational results show that both problems can be solved efficiently using a Lagrangian relaxation optimization procedure.
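The flavor of the Scenario 1 closed form can be shown with the textbook single-facility M/M/1 trade-off between capacity cost and congestion cost (a generic illustration, not the paper's exact objective): minimizing c·mu + w·lambda/(mu − lambda) over mu > lambda gives mu* = lambda + sqrt(w·lambda/c), which the sketch checks against a grid search:

```python
import numpy as np

lam, c, w = 4.0, 2.0, 9.0              # arrival rate, capacity cost, waiting cost
mu_star = lam + np.sqrt(w * lam / c)   # closed-form optimal service rate

# verify against a fine grid search over mu > lam
mus = np.linspace(lam + 1e-3, lam + 20, 200001)
cost = c * mus + w * lam / (mus - lam)
mu_grid = mus[np.argmin(cost)]
print(f"closed form mu* = {mu_star:.3f}, grid mu* = {mu_grid:.3f}")
```

Having the optimal rate in closed form is what lets the paper substitute it out and leave a location-allocation problem over discrete variables only.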

13.
The potential for para‐occupational (or take‐home) exposures from contaminated clothing has been recognized for the past 60 years. To better characterize the take‐home asbestos exposure pathway, a study was performed to measure the relationship between airborne chrysotile concentrations in the workplace, the contamination of work clothing, and take‐home exposures and risks. The study included air sampling during two activities: (1) contamination of work clothing by airborne chrysotile (i.e., loading the clothing), and (2) handling and shaking out of the clothes. The clothes were contaminated at three different target airborne chrysotile concentrations (0–0.1 fibers per cubic centimeter [f/cc], 1–2 f/cc, and 2–4 f/cc; two events each for 31–43 minutes; six events total). Arithmetic mean concentrations for the three target loading levels were 0.01 f/cc, 1.65 f/cc, and 2.84 f/cc (National Institute of Occupational Health and Safety [NIOSH] 7402). Following the loading events, six matched 30‐minute clothes‐handling and shake‐out events were conducted, each including 15 minutes of active handling (15‐minute means; 0.014–0.097 f/cc) and 15 additional minutes of no handling (30‐minute means; 0.006–0.063 f/cc). Percentages of personal clothes‐handling TWAs relative to clothes‐loading TWAs were calculated for event pairs to characterize exposure potential during daily versus weekly clothes‐handling activity. Airborne concentrations for the clothes handler were 0.2–1.4% (eight‐hour TWA or daily ratio) and 0.03–0.27% (40‐hour TWA or weekly ratio) of loading TWAs. Cumulative chrysotile doses for clothes handling at airborne concentrations tested were estimated to be consistent with lifetime cumulative chrysotile doses associated with ambient air exposure (range for take‐home or ambient doses: 0.00044–0.105 f/cc year).
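The daily ratio reported above is ordinary time-weighted-average arithmetic: a short handling event diluted over an 8-hour shift, divided by a loading TWA. A sketch with values chosen inside the abstract's reported ranges (illustrative numbers and a simplified zero-exposure assumption for the rest of the shift, not the study's exact data or method):

```python
# assumed inputs (within the ranges quoted in the abstract)
handling_conc = 0.05   # f/cc during a 30-minute clothes-handling event
handling_min = 30.0    # minutes of exposure
loading_twa = 1.65     # f/cc, mid-level clothes-loading concentration

# 8-hour TWA, assuming zero exposure for the remainder of the shift
twa_8h = handling_conc * handling_min / (8 * 60)
daily_ratio = twa_8h / loading_twa
print(f"8-h TWA = {twa_8h:.4f} f/cc; daily ratio = {100 * daily_ratio:.2f}%")
```

With these assumed inputs the daily ratio lands near 0.2%, the low end of the 0.2-1.4% range the study reports.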

14.
Dual-resource constrained queuing systems contain fewer servers than service facilities. This study uses computer simulation to evaluate several server assignment procedures in a dual-resource system. A field study serves as the basis for developing a model with two service facilities in parallel, a single server, and deterministic information access and transfer delays that can be applied to job shops, computer operating systems, and elevators. Several findings, useful in server assignment decision making, resulted from the study. If first-come, first-served sequencing is used, delaying server assignment at a facility until all jobs are completed reduces both the mean and the variance of job flow time. If shortest-process-time-first sequencing is used, an assignment rule is tested that delays a server at a facility until a sufficiently short job is estimated to have arrived elsewhere. This rule performs best overall in terms of both the mean and variance of flow time. Methods to implement this decision rule easily are discussed.

15.
In networked service systems, facility service can be interrupted by a variety of human-caused events (terrorist acts, hacker attacks, and the like). To defend against premeditated attacks, more attention must be paid to identifying the critical facilities in a network system. Building on the P-median location model, and taking as its objective the maximization of the impact of facility failures on network operating efficiency, this paper formulates the problem of identifying critical network facilities based on the P-median model (the R-interdiction model) and proposes three algorithms for it: greedy search, neighborhood search, and tabu search. The algorithms are compared on large test instances, including Galvo, Europe 150, and USA 263, and tabu search is found to be the most effective. Finally, using the Europe 150 data, the P-median problem is compared with the R-interdiction problem; the analysis suggests that accounting in advance for interruptions caused by deliberate attacks when making location decisions can increase a network's resistance to attack and reduce losses.
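A greedy interdiction heuristic of the kind compared in this paper can be sketched directly: repeatedly remove the open facility whose loss raises the total P-median assignment cost the most. The instance below is a toy distance matrix, not the Galvo/Europe 150/USA 263 data:

```python
def assign_cost(dist, open_facs):
    """Total P-median cost: each client is served by its nearest open facility."""
    return sum(min(row[f] for f in open_facs) for row in dist)

def greedy_interdict(dist, open_facs, r):
    """Remove r facilities, each time the one whose loss hurts the network most."""
    remaining = set(open_facs)
    for _ in range(r):
        worst = max(remaining,
                    key=lambda f: assign_cost(dist, remaining - {f})
                    if len(remaining) > 1 else 0)
        remaining.remove(worst)
    return remaining

# toy instance: 5 clients x 4 facilities distance matrix
dist = [[2, 9, 8, 7],
        [9, 1, 8, 7],
        [8, 9, 2, 7],
        [8, 9, 7, 1],
        [2, 9, 8, 7]]
opened = {0, 1, 2, 3}
survivors = greedy_interdict(dist, opened, r=2)
print("surviving facilities:", survivors,
      "cost after interdiction:", assign_cost(dist, survivors))
```

Greedy interdiction is myopic, which is why the paper also tests neighborhood and tabu search and finds tabu search most effective on the large instances.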

16.
Hurricane Katrina struck an area dense with industry, causing numerous releases of petroleum and hazardous materials. This study integrates information from a number of sources to describe the frequency, causes, and effects of these releases in order to inform analysis of risk from future hurricanes. Over 200 onshore releases of hazardous chemicals, petroleum, or natural gas were reported. Storm surge was responsible for the majority of petroleum releases and failure of storage tanks was the most common mechanism of release. Of the smaller number of hazardous chemical releases reported, many were associated with flaring from plant startup, shutdown, or process upset. In areas impacted by storm surge, 10% of the facilities within the Risk Management Plan (RMP) and Toxic Release Inventory (TRI) databases and 28% of SIC 1311 facilities experienced accidental releases. In areas subject only to hurricane strength winds, a lower fraction (1% of RMP and TRI and 10% of SIC 1311 facilities) experienced a release while 1% of all facility types reported a release in areas that experienced tropical storm strength winds. Of industrial facilities surveyed, more experienced indirect disruptions such as displacement of workers, loss of electricity and communication systems, and difficulty acquiring supplies and contractors for operations or reconstruction (55%), than experienced releases. To reduce the risk of hazardous material releases and speed the return to normal operations under these difficult conditions, greater attention should be devoted to risk‐based facility design and improved prevention and response planning.

17.
Risk, Media, and Stigma at Rocky Flats
Flynn J, Peters E, Mertz CK, Slovic P. Risk Analysis. 1998;18(6):715-727.
Public responses to nuclear technologies are often strongly negative. Events, such as accidents or evidence of unsafe conditions at nuclear facilities, receive extensive and dramatic coverage by the news media. These news stories affect public perceptions of nuclear risks and the geographic areas near nuclear facilities. One result of these perceptions, avoidance behavior, is a form of "technological stigma" that leads to losses in property values near nuclear facilities. The social amplification of risk is a conceptual framework that attempts to explain how stigma is created through media transmission of information about hazardous places and public perceptions and decisions. This paper examines stigma associated with the U.S. Department of Energy's Rocky Flats facility, a major production plant in the nation's nuclear weapons complex, located near Denver, Colorado. This study, based upon newspaper analyses and a survey of Denver area residents, finds that the social amplification theory provides a reasonable framework for understanding the events and public responses that took place in regard to Rocky Flats during a 6-year period, beginning with an FBI raid of the facility in 1989.

18.
We analyze the benefit of production/service capacity sharing for a set of independent firms. Firms have the choice of either operating their own production/service facilities or investing in a facility that is shared. Facilities are modeled as queueing systems with finite service rates. Firms decide on capacity levels (the service rate) to minimize delay costs and capacity investment costs possibly subject to service‐level constraints on delay. If firms decide to operate a shared facility they must also decide on a scheme for sharing the capacity cost. We formulate the problem as a cooperative game and identify settings under which capacity sharing is beneficial and there is a cost allocation that is in the core under either the first‐come, first‐served policy or an optimal priority policy. We show that capacity sharing may not be beneficial in settings where firms have heterogeneous work contents and service variabilities. In such cases, we specify conditions under which capacity sharing may still be beneficial for a subset of the firms.
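The basic pooling benefit behind capacity sharing can be illustrated with two identical M/M/1 facilities versus one shared facility carrying the combined demand with the combined service rate: expected time in system is halved. This ignores the paper's cost-allocation game and its heterogeneity caveats, and the rates are illustrative:

```python
lam, mu = 3.0, 4.0                 # per-firm arrival and service rates

# each firm runs its own M/M/1 facility
w_separate = 1.0 / (mu - lam)      # expected time in system, W = 1/(mu - lam)

# firms share one facility with pooled demand and pooled capacity:
# an M/M/1 queue with arrival rate 2*lam and service rate 2*mu
w_shared = 1.0 / (2 * mu - 2 * lam)

print(f"separate: {w_separate:.3f}, shared: {w_shared:.3f}")
```

The halved delay is the surplus the cooperative game then has to divide; the paper's point is that with heterogeneous work contents and variabilities this surplus can shrink or vanish for some coalitions.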

19.
Deliberate strikes and terrorist attacks can cause sudden interruptions of facility service, one of the principal hazards to network systems, so network facility location decisions should consider system operating costs under both normal and emergency conditions. This paper studies the network facility location problem under worst-case interruption losses and formulates it as a bilevel program: the upper level makes the facility location decisions, and the lower level, given the facility locations, finds the facility interruption that causes the maximum loss. A hybrid genetic algorithm based on Lagrangian relaxation is used to solve the bilevel program. Taking the European150 data set as the test case, the results for this problem are compared with those of the classical P-median location problem, showing that the degree to which interruptions degrade network efficiency differs across location strategies. Finally, the results are analyzed as key parameters are varied, such as the weight on normal operation, the number of facilities, and the number of interrupted facilities.

20.
In the uniform capacitated k-facility location problem (UC-k-FLP), we are given a set of facilities and a set of clients. Every client has a demand. Every facility has an opening cost and a uniform capacity. For each client–facility pair, there is a unit service cost to serve the client with unit demand by the facility. The total demand served by a facility cannot exceed the uniform capacity. We want to open at most k facilities to serve all the demands of the clients without violating the capacity constraint such that the total opening and serving cost is minimized. The main contribution of this work is to present the first combinatorial bi-criteria approximation algorithm for the UC-k-FLP by violating the cardinality constraint.
