Similar Documents
20 similar documents found (search time: 78 ms)
1.
The observed global sea level rise owing to climate change, coupled with the potential increase in extreme storms, requires a reexamination of existing infrastructural planning, construction, and management practices. Storm surge compounds the effects of rising sea levels. The recent super storms that hit the United States (e.g., Hurricane Katrina in 2005, Sandy in 2012, Harvey and Maria in 2017) and China (e.g., Typhoon Haiyan in 2013) inflicted serious loss of life and property. Water level rise (WLR) of local coastal areas is a combination of sea level rise, storm surge, precipitation, and local land subsidence. Quantitative assessments of the impact of WLR include scenario identification, consequence assessment, vulnerability and flooding assessment, and risk management using inventories of assets from coastal areas, particularly population centers, to manage flooding risk and to enhance the infrastructure resilience of coastal cities. This article discusses the impact of WLR on urban infrastructures with case studies of Washington, DC, and Shanghai. Based on the flooding risk analysis under possible scenarios, the property loss for Washington, DC, was evaluated, and the impact on the metro system of Shanghai was examined.

2.
Space weather phenomena have been studied in detail in the peer‐reviewed scientific literature. However, there has arguably been scant analysis of the potential socioeconomic impacts of space weather, despite a growing gray literature from different national studies, of varying degrees of methodological rigor. In this analysis, we therefore provide a general framework for assessing the potential socioeconomic impacts of critical infrastructure failure resulting from geomagnetic disturbances, applying it to the British high‐voltage electricity transmission network. Socioeconomic analysis of this threat has hitherto failed to address the general geophysical risk, asset vulnerability, and the network structure of critical infrastructure systems. We overcome this by using a three‐part method that includes (i) estimating the probability of intense magnetospheric substorms, (ii) exploring the vulnerability of electricity transmission assets to geomagnetically induced currents, and (iii) testing the socioeconomic impacts under different levels of space weather forecasting. This has required a multidisciplinary approach, providing a step toward the standardization of space weather risk assessment. We find that for a Carrington‐sized 1‐in‐100‐year event with no space weather forecasting capability, the gross domestic product loss to the United Kingdom could be as high as £15.9 billion, with this figure dropping to £2.9 billion based on current forecasting capability. However, with existing satellites nearing the end of their life, current forecasting capability will decrease in coming years. Therefore, if no further investment takes place, critical infrastructure will become more vulnerable to space weather. Additional investment could provide enhanced forecasting, reducing the economic loss for a Carrington‐sized 1‐in‐100‐year event to £0.9 billion.
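The headline figures above can be annualized for a rough cost–benefit comparison. A minimal sketch in Python, assuming expected annual loss is simply the event's annual probability times its loss, which is a simplification of the paper's full analysis; the loss figures are the GBP values reported in the abstract:

```python
# Annualized expected loss for the Carrington-sized 1-in-100-year event under
# each forecasting level reported in the abstract. Treats expected annual loss
# as probability x consequence (a simplification of the full analysis).
def annual_expected_loss(return_period_years: float, loss: float) -> float:
    return loss / return_period_years

no_forecasting = annual_expected_loss(100, 15.9e9)   # no forecasting capability
current = annual_expected_loss(100, 2.9e9)           # current forecasting capability
enhanced = annual_expected_loss(100, 0.9e9)          # with additional investment
annual_benefit_of_investment = no_forecasting - enhanced
```

On this crude basis, forecasting investment is worth roughly £150 million per year in avoided expected losses for this single event class alone.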

3.
The devastating impact of Hurricane Sandy (2012) again showed that New York City (NYC) is one of the most vulnerable cities to coastal flooding around the globe. The low‐lying areas in NYC can be flooded by nor'easter storms and North Atlantic hurricanes. The few studies that have estimated potential flood damage for NYC base their damage estimates on only a single, or a few, possible flood events. The objective of this study is to assess the full distribution of hurricane flood risk in NYC. This is done by calculating potential flood damage with a flood damage model that uses many possible storms and surge heights as input. These storms are representative of the low‐probability/high‐impact flood hazard faced by the city. Exceedance probability‐loss curves are constructed under different assumptions about the severity of flood damage. The estimated flood damage to buildings for NYC is between US$59 million and US$129 million per year. The damage caused by a 1/100‐year storm surge is within a range of US$2 bn–5 bn, while this is between US$5 bn and 11 bn for a 1/500‐year storm surge. An analysis of flood risk in each of the five boroughs of NYC finds that Brooklyn and Queens are the most vulnerable to flooding. This study examines several uncertainties in the various steps of the risk analysis, which result in variations in the flood damage estimates. These uncertainties include the interpolation of flood depths, the use of different flood damage curves, and the influence of the spectrum of characteristics of the simulated hurricanes.
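An exceedance probability–loss curve can be integrated to obtain expected annual damage. A hedged sketch, using only the two storm surge damage points reported above (midpoints of the stated ranges), so the result illustrates the method rather than reproducing the study's full-curve US$59–129 million/year estimate:

```python
import numpy as np

def expected_annual_damage(return_periods, losses):
    """Expected annual damage: trapezoidal integration of the exceedance
    probability-loss curve (annual exceedance probability on x, loss on y)."""
    p = 1.0 / np.asarray(return_periods, dtype=float)  # annual exceedance probabilities
    L = np.asarray(losses, dtype=float)
    order = np.argsort(p)                              # integrate over increasing probability
    p, L = p[order], L[order]
    return float(np.sum((L[1:] + L[:-1]) / 2.0 * np.diff(p)))

# Midpoints of the two damage bands reported in the abstract (two points only,
# so this understates the full-curve result): 1/100-year ~US$3.5 bn,
# 1/500-year ~US$8 bn.
ead = expected_annual_damage([100, 500], [3.5e9, 8.0e9])  # ~US$46 million/year
```

With only two curve points the partial integral already lands in the same order of magnitude as the study's annual figure; the full analysis uses many simulated storms to resolve the rest of the curve.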

4.
We study the effect of railroad access on urban population growth. Using GIS techniques, we match triennial population data for roughly 1,000 cities in 19th‐century Prussia to georeferenced maps of the German railroad network. We find positive short‐ and long‐term effects of having a station on urban growth for different periods during 1840–1871. Causal effects of (potentially endogenous) railroad access on city growth are identified using propensity score matching, instrumental variables, and fixed‐effects estimation techniques. Our instrument identifies exogenous variation in railroad access by constructing straight‐line corridors between nodes. Counterfactual models using pre‐railroad growth yield no evidence to support the hypothesis that railroads appeared as a consequence of a previous growth spurt.

5.
The rise in economic disparity presents significant risks to global social order and the resilience of local communities. However, existing measurement science for economic disparity (e.g., the Gini coefficient) does not explicitly consider a probability distribution with the information, deficiencies, and uncertainties associated with the underlying income distribution. This article introduces the quantification of Shannon entropy for income inequality across scales, including national‐, subnational‐, and city‐level data. The probabilistic principles of Shannon entropy provide a new interpretation of uncertainty and risk related to economic disparity. Entropy and information‐based conflict rise as world incomes converge. High‐entropy instances can resemble happy and prosperous societies as well as socialist–communist social structures. Low entropy signals high‐risk tipping points, enabling anomaly and conflict detection with higher confidence. Finally, spatial–temporal entropy maps for U.S. cities offer a city risk profiling framework. The results show polarization of household incomes within and across Baltimore, Washington, DC, and San Francisco. Entropy produces reliable results at significantly lower computational cost than the Gini coefficient.
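The core quantity, the Shannon entropy of an income distribution, is straightforward to compute. A minimal sketch; the binning choices here are our own illustrative assumptions, not the article's data sources or bracket definitions:

```python
import numpy as np

def income_entropy(incomes, bins=10):
    """Shannon entropy (in bits) of an income distribution.

    Incomes are binned into a histogram; normalized bin counts form the
    probability distribution p over income brackets.
    """
    counts, _ = np.histogram(incomes, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))

# Incomes spread evenly across brackets maximize entropy (log2 of the bin
# count); identical incomes collapse into one bracket with zero entropy.
rng = np.random.default_rng(0)
spread = income_entropy(rng.uniform(20_000, 200_000, 10_000), bins=8)
concentrated = income_entropy(np.full(10_000, 60_000.0), bins=8)
```

The Gini coefficient requires pairwise comparisons (or a sort) over incomes, whereas entropy needs only a single histogram pass, which is consistent with the computational-cost claim above.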

6.
Studies into the impact of top manager change on organization performance have revealed inconsistent findings. Using longitudinal data over a 12‐year period on football organizations, we test for the short‐term and long‐term effects of manager change in comparison to the tenures of incumbent top managers. We find that long incumbent tenures are associated with performance far above the average. But when looking at change events, contrary to theoretical expectations, we find that change in the short term leads to a brief reprieve in poor performance only for performance to deteriorate in the long term as underlying weaknesses once again take hold. Our findings reveal the illusion of a short‐term reprieve and the long‐term consequences of this illusion. We map several implications for research and practice from our work.

7.
The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45‐ and 65‐year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber‐oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3–14%), and short‐term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land‐holding costs, a no‐harvest management scenario would become revenue‐positive at a carbon credit break‐point price of $14.17/Mg carbon dioxide equivalent (CO2e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business‐as‐usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation.
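The probabilistic NPV comparison can be sketched with a toy Monte Carlo simulation. All parameter values below (discount rate, price spread, credit volume, horizon) are illustrative assumptions rather than the paper's calibration, and only carbon-price uncertainty is sampled, whereas the study also modeled growth-model, inventory, and timber-price uncertainty:

```python
import random

def npv(cash_flows, rate=0.05):
    """Net present value of annual cash flows at a fixed discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_carbon_npv(price_mean, n=2_000, years=45, mg_per_year=1_000, seed=1):
    """Mean Monte Carlo NPV of a no-harvest carbon project.

    Samples only carbon-price uncertainty (+/-30% uniform around price_mean,
    in $/Mg CO2e); every parameter here is a made-up illustration.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        price = price_mean * rng.uniform(0.7, 1.3)
        results.append(npv([price * mg_per_year] * years))
    return sum(results) / len(results)

# Mean NPV scales linearly with the mean carbon credit price, which is why a
# single break-even price (the study reports $14.17/Mg CO2e) can be solved for.
low, high = simulate_carbon_npv(7.0), simulate_carbon_npv(14.0)
```

Because expected revenue is linear in the mean credit price, doubling the price doubles the mean NPV; the break-even price is then the value at which project NPV matches the timber-management baseline.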

8.
This paper analyzes a framework in which countries over time pollute and invest in green technologies. Without a climate treaty, the countries pollute too much and invest too little, particularly if intellectual property rights are weak. Nevertheless, short‐term agreements on emission levels then reduce every country's payoff, since countries invest less when they anticipate future negotiations. If intellectual property rights are weak, the agreement should be tougher and more long‐term. Conversely, if the climate agreement happens to be short‐term or absent, intellectual property rights should be strengthened or technological licensing subsidized.

9.
Coastal cities around the world have experienced large costs from major flooding events in recent years. Climate change is predicted to bring an increased likelihood of flooding due to sea level rise and more frequent severe storms. In order to plan future development and adaptation, cities must know the magnitude of losses associated with these events, and how they can be reduced. Often losses are calculated from insurance claims or by surveying flood victims. However, this largely neglects the loss due to the disruption of economic activity. We use a forward‐looking dynamic computable general equilibrium model to study how a local economy responds to a flood, focusing on the subsequent recovery/reconstruction. Initial damage is modeled as a shock to the capital stock, and recovery requires rebuilding that stock. We apply the model to Vancouver, British Columbia, by considering a flood scenario causing total capital damage of $14.6 billion spread across five municipalities. GDP losses relative to a no‐flood scenario persist for years: 2.0% ($2.2 billion) in the first year after the flood, 1.7% ($1.9 billion) in the second year, and 1.2% ($1.4 billion) in the fifth year.

10.
Criticisms of patent laws for technological innovations in the United States reveal a multifaceted milieu of problems centered around the protection of short‐term economic gain and individual property rights. In this article, we consider this a conflict between current patent laws and the innovation capabilities of organizations. We propose a solution that enables the company to assure its long‐term survival in the face of these restrictions. This presumes that the firm will at least maintain its innovation capacities while preserving the company's ethical values and those of its social environment. We offer a theoretical model that is designed to help managers and policymakers reorient their governance strategies for managing the innovation process, using the “ethics of responsibility,” which establishes the link to individual moral values at the beginning of a governance process as well as the consequences of a decision. Our integrated causal model of ethical innovation for patents is presented and implications for global organizations and possible solutions for patent law process failure are offered.

11.
12.
The estimated cost of fire in the United States is about $329 billion a year, yet there are gaps in the literature on measuring the effectiveness of investment and on allocating resources optimally in fire protection. This article fills these gaps by creating data‐driven empirical and theoretical models to study the effectiveness of nationwide fire protection investment in reducing economic and human losses. The regression between investment and loss vulnerability shows high R2 values (≈0.93). This article also contributes to the literature by modeling strategic (national‐level or state‐level) resource allocation (RA) for fire protection with equity–efficiency trade‐off considerations, while the existing literature focuses on operational‐level RA. This model and its numerical analyses provide techniques and insights to aid the strategic decision‐making process. The results from this model are used to calculate fire risk scores for various geographic regions, which can be used as indicators of fire risk. A case study of federal fire grant allocation is used to validate and show the utility of the optimal RA model. The results also identify potential underinvestment and overinvestment in fire protection in certain regions. This article presents scenarios in which the proposed model outperforms the existing RA scheme when compared in terms of the correlation of resources allocated with the actual number of fire incidents. This article provides novel insights to policymakers and analysts in fire protection and safety that would help in mitigating economic costs and saving lives.
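The empirical step, regressing loss vulnerability on investment and reporting R², can be sketched as follows. The data below are synthetic and the variable definitions are our own stand-ins; the article reports R² ≈ 0.93 on its national data:

```python
import numpy as np

# Synthetic illustration of the regression step: loss vulnerability declines
# with fire protection investment, plus noise. Numbers are invented.
rng = np.random.default_rng(42)
investment = rng.uniform(50, 500, 40)                           # $ per capita (synthetic)
vulnerability = 120 - 0.2 * investment + rng.normal(0, 5, 40)   # synthetic response

slope, intercept = np.polyfit(investment, vulnerability, 1)     # OLS fit
predicted = slope * investment + intercept
ss_res = np.sum((vulnerability - predicted) ** 2)
ss_tot = np.sum((vulnerability - vulnerability.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
```

A high R² on such a regression is what justifies using fitted vulnerability as an input to the strategic resource-allocation model described above.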

13.
We argue that one reason why emerging economies borrow short term is that it is cheaper than borrowing long term. This is especially the case during crises, as during these episodes the relative cost of long‐term borrowing increases. We construct a unique database of sovereign bond prices, returns, and issuances at different maturities for 11 emerging economies from 1990 to 2009 and present a set of new stylized facts. On average, these countries pay a higher risk premium on long‐term than on short‐term bonds. During crises, the difference between the two risk premia increases and issuance shifts towards shorter maturities. To illustrate our argument, we present a simple model in which the maturity structure is the outcome of a risk‐sharing problem between an emerging economy subject to rollover crises and risk‐averse international investors.

14.
In the nuclear power industry, Level 3 probabilistic risk assessment (PRA) is used to estimate damage to public health and the environment if a severe accident leads to a large radiological release. Current Level 3 PRA does not explicitly include social factors and, therefore, it is not possible to perform importance ranking of social factors for risk‐informing emergency preparedness, planning, and response (EPPR). This article offers a methodology for adapting the concept of social vulnerability, commonly used in natural hazard research, to the context of a severe nuclear power plant accident. The methodology has four steps: (1) calculating a hazard‐independent social vulnerability index for the local population; (2) developing a location‐specific representation of the maximum radiological hazard estimated from current Level 3 PRA, in a geographic information system (GIS) environment; (3) developing a GIS‐based socio‐technical risk map by combining the social vulnerability index and the location‐specific radiological hazard; and (4) conducting a risk importance measure analysis to rank the criticality of social factors based on their contribution to the socio‐technical risk. The methodology is applied using results from the 2012 Surry Power Station state‐of‐the‐art reactor consequence analysis. A radiological hazard model is generated from the MELCOR Accident Consequence Code System, translated into a GIS environment, and combined with the Centers for Disease Control and Prevention social vulnerability index (SVI). This research creates an opportunity to explicitly consider and rank the criticality of location‐specific SVI themes based on their influence on risk, providing input for EPPR.
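Step (3), combining the SVI with the location-specific hazard, reduces to an elementwise grid operation. A minimal sketch with invented values; the elementwise product is one common combination rule for risk = hazard × vulnerability and is not necessarily the paper's exact formulation:

```python
import numpy as np

# Toy 2x2 grids standing in for GIS rasters; all values are invented.
svi = np.array([[0.2, 0.4],
                [0.9, 0.5]])          # 0 (low) to 1 (high) social vulnerability
hazard = np.array([[0.1, 0.8],
                   [0.3, 0.6]])       # normalized radiological hazard per cell

risk = svi * hazard                   # socio-technical risk per grid cell
hotspot = np.unravel_index(np.argmax(risk), risk.shape)  # highest-risk cell
```

Note that the highest-risk cell need not be the most hazardous or the most vulnerable one on its own; the combination is what drives the EPPR ranking.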

15.
We analyzed wildfire exposure for key social and ecological features on the national forests in Oregon and Washington. The forests contain numerous urban interfaces, old‐growth forests, recreational sites, and habitat for rare and endangered species. Many of these resources are threatened by wildfire, especially in the fire‐prone forests of the east Cascade Mountains. The study illustrates the application of wildfire simulation for risk assessment where the major threat is from large, rare, naturally ignited fires, in contrast to many previous studies that have focused on risk driven by frequent, small fires from anthropogenic ignitions. Wildfire simulation modeling was used to characterize potential wildfire behavior in terms of annual burn probability and flame length. Spatial data on selected social and ecological features were obtained from Forest Service GIS databases and elsewhere. The potential wildfire behavior was then summarized for each spatial location of each resource. The analysis suggested strong spatial variation in both burn probability and conditional flame length for many of the features examined, including biodiversity, urban interfaces, and infrastructure. We propose that the spatial patterns in modeled wildfire behavior could be used to improve existing prioritization of fuel management and wildfire preparedness activities within the Pacific Northwest region.

16.
17.
Opaque pricing is a form of pricing where certain characteristics of the product or service are hidden from the consumer until after purchase. In essence, opaque selling transforms a differentiated good into a commodity. Opaque pricing has become popular in service pricing as it allows firms to sell their differentiated product at higher prices to regular brand‐loyal customers while simultaneously selling to non‐brand‐loyal customers at discounted prices. We use a nested logit model in combination with logistic regression and dynamic programming to illustrate how a service firm can optimally set prices on an opaque sales channel. The choice model allows the characterization of consumer trade‐offs when purchasing opaque products, while the dynamic programming approach allows the characterization of the optimal pricing policy as a function of inventory and time remaining. We compare optimal prices and expected revenues when dynamic pricing is restricted to daily price changes. We provide an illustrative example using data from an opaque selling mechanism (Hotwire.com) and a Washington, DC-based hotel.
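The dynamic programming component can be illustrated with a toy backward induction over inventory and time. The logistic purchase probabilities and the price grid below are invented, and the paper's nested logit choice model is replaced by a single logistic curve for brevity:

```python
import math

def purchase_prob(price, a=5.0, b=0.05):
    """Logistic probability that the day's customer buys at a given opaque
    price (synthetic parameters, standing in for the fitted choice model)."""
    return 1.0 / (1.0 + math.exp(-(a - b * price)))

def optimal_policy(horizon, capacity, prices=(60, 80, 100, 120)):
    """Backward induction: V[t][n] = max expected revenue with n rooms left
    and t days remaining, assuming one customer arrival per day (a toy
    simplification of the demand process)."""
    V = [[0.0] * (capacity + 1) for _ in range(horizon + 1)]
    policy = [[None] * (capacity + 1) for _ in range(horizon + 1)]
    for t in range(1, horizon + 1):
        for n in range(1, capacity + 1):
            best, best_price = V[t - 1][n], None   # option: don't sell today
            for p in prices:
                q = purchase_prob(p)
                value = q * (p + V[t - 1][n - 1]) + (1 - q) * V[t - 1][n]
                if value > best:
                    best, best_price = value, p
            V[t][n] = best
            policy[t][n] = best_price
    return V, policy

V, policy = optimal_policy(horizon=10, capacity=2)
```

The resulting `policy[t][n]` table is exactly the "price as a function of inventory and time remaining" object the abstract describes, here under made-up demand parameters.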

18.
Risk Analysis, 2018, 38(5): 1052–1069
This study investigated whether, in the absence of chronic noncancer toxicity data, short‐term noncancer toxicity data can be used to predict chronic toxicity effect levels by focusing on the dose–response relationship instead of a critical effect. Data from National Toxicology Program (NTP) technical reports have been extracted and modeled using the Environmental Protection Agency's Benchmark Dose Software. Best‐fit, minimum benchmark dose (BMD), and benchmark dose lower limits (BMDLs) have been modeled for all NTP pathologist identified significant nonneoplastic lesions, final mean body weight, and mean organ weight of 41 chemicals tested by NTP between 2000 and 2012. Models were then developed at the chemical level using orthogonal regression techniques to predict chronic (two years) noncancer health effect levels using the results of the short‐term (three months) toxicity data. The findings indicate that short‐term animal studies may reasonably provide a quantitative estimate of a chronic BMD or BMDL. This can allow for faster development of human health toxicity values for risk assessment for chemicals that lack chronic toxicity data.
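Orthogonal regression differs from ordinary least squares in minimizing perpendicular rather than vertical distances, which matters here because both the short-term and the chronic BMDs are estimated with error. A sketch via total least squares; the BMD values are invented for illustration, not NTP data:

```python
import numpy as np

def orthogonal_regression(x, y):
    """Fit y = slope*x + intercept by total least squares (orthogonal
    regression): the smallest right singular vector of the centered data is
    the normal to the best-fit line."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    centered = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(centered)
    a, b = vt[-1]                          # normal vector of the fitted line
    slope = -a / b
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Hypothetical BMDs in log10 mg/kg-day; invented so that chronic values lie
# exactly 0.4 log-units below the short-term values.
short_term_bmd = [0.5, 1.0, 1.5, 2.0, 2.5]
chronic_bmd = [0.1, 0.6, 1.1, 1.6, 2.1]
slope, intercept = orthogonal_regression(short_term_bmd, chronic_bmd)
```

On real data the fitted slope and intercept would quantify how far, on average, three-month effect levels must be adjusted to approximate two-year effect levels.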

19.
Autonomy is known for its positive effects and its use in management practice. Recently, an urgent debate has emerged about its drawbacks for individual outcomes. In this study, we investigate and test a model of the effect of an autonomy‐supportive teaching style on individual learning and its interplay with the learner's previous experience and perceived management support. Specifically, while research has emphasized the positive effect of similar contexts, this study focuses on its differential effect on short‐term and long‐term learning outcomes, challenging the traditional view of autonomy. We also explore how job experience and management support can improve the effects of autonomy on individual learning. We test our model by collecting longitudinal data on a sample of 200 individuals participating in a training programme on managerial skills. Our results show that (1) the extent to which teachers were perceived as autonomy‐supportive presents a linear relationship with short‐term learning outcomes (utility reactions) and a positive curvilinear relationship with training transfer in the long term; and (2) learner job experience and perceived management support for learning positively moderate the linear relationship between autonomy and learning outcomes.

20.
Failure of critical national infrastructures can result in major disruptions to society and the economy. Understanding the criticality of individual assets and the geographic areas in which they are located is essential for targeting investments to reduce risks and enhance system resilience. Within this study we provide new insights into the criticality of real‐life critical infrastructure networks by integrating high‐resolution data on infrastructure location, connectivity, interdependence, and usage. We propose a metric of infrastructure criticality in terms of the number of users who may be directly or indirectly disrupted by the failure of physically interdependent infrastructures. Kernel density estimation is used to integrate spatially discrete criticality values associated with individual infrastructure assets, producing a continuous surface from which statistically significant infrastructure criticality hotspots are identified. We develop a comprehensive and unique national‐scale demonstration for England and Wales that utilizes previously unavailable data from the energy, transport, water, waste, and digital communications sectors. The testing of 200,000 failure scenarios identifies that hotspots are typically located around the periphery of urban areas where there are large facilities upon which many users depend or where several critical infrastructures are concentrated in one location.
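The kernel density step, turning discrete per-asset criticality values into a continuous surface, can be sketched directly with weighted Gaussian kernels. The coordinates, weights, and bandwidth below are toy values, and this plain kernel sum omits the study's statistical-significance testing of hotspots:

```python
import numpy as np

def kde_surface(points, weights, grid_x, grid_y, bandwidth=1.0):
    """Weighted Gaussian kernel density over a regular grid.

    points  : (n, 2) asset coordinates
    weights : per-asset criticality (e.g., number of users disrupted)
    Returns a (len(grid_y), len(grid_x)) continuous criticality surface.
    """
    xx, yy = np.meshgrid(grid_x, grid_y)
    surface = np.zeros_like(xx, dtype=float)
    for (px, py), w in zip(points, weights):
        d2 = (xx - px) ** 2 + (yy - py) ** 2
        surface += w * np.exp(-d2 / (2 * bandwidth ** 2))
    return surface

# Toy example: two assets; the heavier-weighted one dominates the hotspot.
pts = np.array([[1.0, 1.0], [4.0, 4.0]])
w = np.array([100.0, 10.0])              # users dependent on each asset
gx = gy = np.linspace(0, 5, 51)
surf = kde_surface(pts, w, gx, gy)
peak = np.unravel_index(np.argmax(surf), surf.shape)  # hotspot grid cell
```

Weighting each kernel by users disrupted is what makes the surface a criticality map rather than a plain asset-density map.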


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号