Similar articles
Found 20 similar articles (search time: 62 ms).
1.
A key strategic issue in pre-disaster planning for humanitarian logistics is the pre-establishment of adequate capacity and resources to enable efficient relief operations. This paper develops a two-stage stochastic optimization model to guide the allocation of budget to acquire and position relief assets, decisions that typically must be made well before a disaster strikes. The optimization minimizes the expected number of casualties, so the model includes first-stage decisions that represent the expansion of resources such as warehouses, medical facilities with personnel, ramp spaces, and shelters. Second-stage decisions concern the logistics of the problem: allocated resources and contracted transportation assets are deployed to rescue the critical population (those in need of emergency evacuation), deliver required commodities to the stay-back population, and transport the transfer population displaced by the disaster. Because the event's location and severity are uncertain, these and other parameters are represented as scenarios. Computational results on notional test cases provide guidance on budget allocation and demonstrate the potential benefit of using stochastic optimization.
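The two-stage structure described above can be sketched in miniature: first-stage capacity decisions are fixed before the scenario is revealed, and each scenario's unmet demand is then evaluated as a casualty proxy. All numbers below (budget, scenarios, demands) are invented for illustration; the paper's actual model is a much richer stochastic program.

```python
# Toy two-stage stochastic budget allocation, solved by enumeration.
# First stage: split a fixed budget between warehouse and medical capacity.
# Second stage: per scenario, unmet demand stands in for casualties.
BUDGET = 10  # units of budget to allocate (hypothetical)

# Scenarios: (probability, relief-capacity demand, medical-capacity demand)
scenarios = [(0.5, 4, 3), (0.3, 7, 5), (0.2, 9, 8)]

def expected_casualties(warehouses, medical):
    """Second stage: expected unmet demand across scenarios."""
    total = 0.0
    for prob, relief_demand, medical_demand in scenarios:
        unmet = max(0, relief_demand - warehouses) + max(0, medical_demand - medical)
        total += prob * unmet
    return total

# First stage: enumerate all feasible budget splits and pick the best.
best = min(
    ((w, BUDGET - w) for w in range(BUDGET + 1)),
    key=lambda alloc: expected_casualties(*alloc),
)
print("best allocation (warehouses, medical):", best)
print("expected casualties:", expected_casualties(*best))
```

The enumeration stands in for the integer program a real model would use; the key point is that the first-stage decision is evaluated against the expectation over all scenarios, not any single one.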

2.
3.
This work aims at investigating multi-criteria modeling frameworks for discrete stochastic facility location problems with single sourcing. We assume that demand is stochastic and also that a service level is imposed. This situation is modeled using a set of probabilistic constraints. We also consider a minimum throughput at the facilities to justify opening them. We investigate two paradigms in terms of multi-criteria optimization: vectorial optimization and goal programming. Additionally, we discuss the joint use of objective functions that are relevant in the context of some humanitarian logistics problems. We apply the general modeling frameworks proposed to the so-called stochastic shelter site location problem. This is a problem emerging in the context of preventive disaster management. We test the models proposed using two real benchmark data sets. The results show that considering uncertainty and multiple objectives in the type of facility location problems investigated leads to solutions that may better support decision making.
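The service-level (probabilistic) constraint mentioned above has a simple discrete form: choose the smallest capacity q such that P(demand ≤ q) ≥ α. A minimal sketch with invented scenario data (probabilities in percent to keep the arithmetic exact):

```python
# Chance-constrained capacity sizing over a discrete demand distribution.
scenarios = [(30, 20), (50, 50), (80, 20), (120, 10)]  # (demand, probability in %)
alpha = 90  # required service level, in %

def service_level(q):
    """Probability (in %) that scenario demand does not exceed capacity q."""
    return sum(p for demand, p in scenarios if demand <= q)

# Smallest candidate capacity meeting the service level.
min_capacity = min(d for d, _ in scenarios if service_level(d) >= alpha)
print(min_capacity)
```

In a full shelter-location model this constraint would appear once per open facility, alongside the minimum-throughput condition the abstract mentions.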

4.
Abstract

Over the past decades, there has been increasing interest in studying humanitarian operations management. The mismatch between global humanitarian needs and the resources available, together with chronic vulnerability in many parts of the world, continues to have a direct bearing on the lives of millions of people in need of assistance. It also means that donors have to redouble their efforts to respond to disasters in a more efficient and effective manner. International humanitarian organizations (IHOs) often deal with a mix of disaster response and development programmes simultaneously. This operational mix entails disaster cycle management challenges such as project and programme planning of multi-objective global logistics, balancing earmarked donations for disaster response with budget needs for development programmes, and determining the push-pull boundaries in the supply chain, particularly with the increase in cash transfer programmes. The main purpose of this special issue is to report on research in humanitarian operations management. This special issue attempts to explore and examine the above topical issues at strategic, operational and technical levels.

5.
Humanitarian aid agencies deliver emergency supplies and services to people affected by disasters. Scholars and practitioners have developed modeling approaches to support aid delivery planning, but they have used objective functions with little validation as to the trade-offs among the multiple goals of aid delivery. We develop a method to value the performance of aid delivery plans based on expert preferences over five key attributes: the amount of cargo delivered, the prioritization of aid by commodity type, the prioritization of aid by delivery location, the speed of delivery, and the operational cost. Through a conjoint analysis survey, we measure the preferences of 18 experienced humanitarian logisticians. The survey results quantify the importance of each attribute and enable the development of a piecewise linear utility function that can be used as an objective function in optimization models. The results show that the amount of cargo delivered is the most valued objective and cost the least important. In addition, experts prioritize more vulnerable communities and more critical commodities, but not to the exclusion of others. With these insights and the experts' utility functions, better humanitarian objective functions can be developed to enable better aid delivery in emergency response.
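A piecewise linear utility over several attributes, as described above, can be written as a weighted sum of interpolated attribute scores. The weights, breakpoints, and curve shape below are invented for illustration (they are not the survey's elicited values); only the ordering (cargo weighted highest, cost lowest) follows the abstract.

```python
# Sketch of a piecewise linear multi-attribute utility for an aid delivery plan.
def piecewise(x, breakpoints):
    """Linear interpolation over (score, utility) breakpoints; x in [0, 1]."""
    for (x0, u0), (x1, u1) in zip(breakpoints, breakpoints[1:]):
        if x0 <= x <= x1:
            return u0 + (u1 - u0) * (x - x0) / (x1 - x0)
    raise ValueError("x outside breakpoint range")

# Diminishing returns on cargo: steep utility gain for the first deliveries.
cargo_curve = [(0.0, 0.0), (0.5, 0.8), (1.0, 1.0)]

weights = {"cargo": 0.35, "commodity_priority": 0.2,
           "location_priority": 0.2, "speed": 0.15, "cost": 0.1}

def plan_utility(scores):
    u = weights["cargo"] * piecewise(scores["cargo"], cargo_curve)
    for attr in ("commodity_priority", "location_priority", "speed", "cost"):
        u += weights[attr] * scores[attr]  # linear in the remaining attributes
    return u

example = {"cargo": 0.5, "commodity_priority": 1.0,
           "location_priority": 0.5, "speed": 0.5, "cost": 1.0}
print(round(plan_utility(example), 3))
```

Because the function is piecewise linear, it can be embedded directly in a mixed-integer optimization model, which is why this functional form is attractive as an objective.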

6.
Abstract

This article examines the ethical dimensions and implications of recruitment and human resource development that face both local and international aid agencies operating in the context of an emergency response. Focusing on post-tsunami Sri Lanka it contends that, although the rapid proliferation of humanitarian organizations responding to the disaster created a boom in employment opportunities with international agencies, it also resulted in a human resource crisis for local ones and consequently the erosion of national emergency response capacities. It argues that the current recruitment and HRD practices of humanitarian organizations are insensitive to the disaster response needs of local agencies. It concludes with a call for international aid agencies to be more aware of the implications of their recruitment and HRD strategies in disaster-affected countries and recommends a number of ways in which practices could be improved to support, rather than diminish, local capacities.

7.
Yacov Y. Haimes, Risk Analysis, 2012, 32(11): 1834–1845
Natural and human-induced disasters affect organizations in myriad ways because of the inherent interconnectedness and interdependencies among human, cyber, and physical infrastructures, but more importantly, because organizations depend on the effectiveness of people and on the leadership they provide to the organizations they serve and represent. These human-organizational-cyber-physical infrastructure entities are termed systems of systems. Given the multiple perspectives that characterize them, they cannot be modeled effectively with a single model. The focus of this article is: (i) the centrality of the states of a system in modeling; (ii) the efficacious role of shared states in modeling systems of systems, in identification, and in the meta-modeling of systems of systems; and (iii) the contributions of the above to strategic preparedness for, response to, and recovery from catastrophic risk to such systems. Strategic preparedness connotes a decision-making process and its associated actions. These must be implemented in advance of a natural or human-induced disaster, aimed at reducing consequences (e.g., recovery time, community suffering, and cost), and/or at controlling their likelihood to a level considered acceptable (through the decision makers' implicit and explicit acceptance of various risks and tradeoffs). The inoperability input-output model (IIM), which is grounded in Leontief's input-output model, has enabled the modeling of interdependent subsystems. Two separate modeling structures are introduced: phantom system models (PSM), where shared states constitute the essence of modeling coupled systems; and the IIM, where interdependencies among sectors of the economy are manifested by the Leontief matrix of technological coefficients. This article demonstrates the potential contributions of these two models to each other, and thus to a more informative modeling schema for systems of systems. The contributions of shared states to this modeling and to systems identification are presented with case studies.

8.
Toxoplasma gondii is a protozoan parasite that is responsible for approximately 24% of deaths attributed to foodborne pathogens in the United States. It is thought that a substantial portion of human T. gondii infections is acquired through the consumption of meats. The dose-response relationship for human exposures to T. gondii-infected meat is unknown because no human data are available. The goal of this study was to develop and validate dose-response models based on animal studies, and to compute scaling factors so that animal-derived models can predict T. gondii infection in humans. Relevant studies in the literature were collected, and appropriate studies were selected based on animal species, stage, genotype of T. gondii, and route of infection. Data were pooled and fitted to four sigmoidal-shaped mathematical models, and model parameters were estimated using maximum likelihood estimation. Data from a mouse study were selected to develop the dose-response relationship. The exponential and beta-Poisson models, which predicted similar responses, were selected as reasonable dose-response models based on their simplicity, biological plausibility, and goodness of fit. A confidence interval for the parameter was determined by constructing 10,000 bootstrap samples. Scaling factors were computed by matching the predicted infection cases with the epidemiological data. The mouse-derived models were validated against data for the dose-infection relationship in rats. A human dose-response model was developed as P(d) = 1 − exp(−0.0015 × 0.005 × d) or P(d) = 1 − (1 + d × 0.003/582.414)^(−1.479). Both models predict the human response after consuming T. gondii-infected meats, and provide an enhanced risk characterization in a quantitative microbial risk assessment model for this pathogen.
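The two human dose-response models quoted in the abstract can be written out directly as functions (parameter values taken as printed in the text; the grouping of the scaling factors follows the abstract's formulas as written):

```python
import math

def p_exponential(d, r=0.0015 * 0.005):
    """Exponential model: P(d) = 1 - exp(-r * d)."""
    return 1.0 - math.exp(-r * d)

def p_beta_poisson(d, alpha=1.479, beta=582.414, scale=0.003):
    """Approximate beta-Poisson model: P(d) = 1 - (1 + scale*d/beta)**(-alpha)."""
    return 1.0 - (1.0 + scale * d / beta) ** (-alpha)

# Both models give similar, monotonically increasing infection probabilities.
for dose in (10, 100, 1000):
    print(dose, round(p_exponential(dose), 6), round(p_beta_poisson(dose), 6))
```

At low doses both curves are nearly linear in d, which is why the two fitted models predict similar responses over the range of interest.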

9.
Research has documented that immigrants tend to experience more negative consequences from natural disasters compared to native-born individuals, although research on how immigrants perceive and respond to natural disaster risks is sparse. We investigated how risk perception and disaster preparedness for natural disasters in immigrants compared to Canadian-born individuals as justifications for culturally-adapted risk communication and management. To this end, we analyzed the ratings on natural disaster risk perception beliefs and preparedness behaviors from a nationally representative survey (N = 1,089). Factor analyses revealed three underlying psychological dimensions of risk perception: external responsibility for disaster management, self-preparedness responsibility, and illusiveness of preparedness. Although immigrants and Canadian-born individuals shared the three-factor structure, there were differences in the salience of five risk perception beliefs. Despite these differences, immigrants and Canadian-born individuals were similar in the level of risk perception dimensions and disaster preparedness. Regression analyses revealed self-preparedness responsibility and external responsibility for disaster management positively predicted disaster preparedness whereas illusiveness of preparedness negatively predicted disaster preparedness in both groups. Our results showed that immigrants' risk perception and disaster preparedness were comparable to their Canadian-born counterparts. That is, immigrant status did not necessarily yield differences in risk perception and disaster preparedness. These social groups may benefit from a risk communication and management strategy that addresses these risk perception dimensions to increase disaster preparedness. Given the diversity of the immigrant population, the model remains to be tested by further population segmentation.

10.
This research investigates the public's trust in risk-managing organizations after suffering serious damage from a major disaster. It is natural for public trust to decrease in organizations responsible for mitigating the damage. However, what about trust in organizations that address hazards not directly related to the disaster? Based on the results of surveys conducted by a national institute, the Japanese government concluded, in a White Paper on Science and Technology, that the public's trust in scientists declined overall after the 2011 Tohoku Earthquake. Because scientists play a key role in risk assessment and risk management in most areas, one could predict that trust in risk-managing organizations overall would decrease after a major disaster. The methodology of that survey, however, had limitations that prevented such conclusions. For this research, two surveys were conducted to measure the public's trust in risk-managing organizations regarding various hazards, before and after the Tohoku Earthquake (n = 1,192 in 2008 and n = 1,138 in 2012). The results showed that trust decreased in risk-managing organizations that deal with earthquakes and nuclear accidents, whereas trust levels related to many other hazards, especially in areas not touched by the Tohoku Earthquake, remained steady or even increased. These results reject the assertion that distrust rippled through all risk-managing organizations. The implications of this research are discussed, with the observation that this result is not necessarily gratifying for risk managers because high trust sometimes reduces public preparedness for disasters.

11.
We have reviewed disaster management research papers published in major operations management, management science, operations research, supply chain management and transportation/logistics journals. In reviewing these studies, our objective is to assess and present the macro-level "architectural blueprint" of disaster management research, with the hope that it will attract new researchers and motivate established researchers to contribute to this important field. The secondary objective is to bring this disaster research to the attention of disaster administrators so that disasters are managed more efficiently and more effectively. We have mapped the disaster management research on the following five attributes of a disaster: (1) Disaster Management Function (decision-making process, prevention and mitigation, evacuation, humanitarian logistics, casualty management, and recovery and restoration), (2) Time of Disaster (before, during and after), (3) Type of Disaster (accidents, earthquakes, floods, hurricanes, landslides, terrorism, wildfires, etc.), (4) Data Type (field and archival data, real and hypothetical data), and (5) Data Analysis Technique (bidding models, decision analysis, expert systems, fuzzy system analysis, game theory, heuristics, mathematical programming, network flow models, queueing theory, simulation and statistical analysis). We have cross-tabulated the data among these five parameters to gain greater insights into disaster research. Recommendations for future research are provided.

12.
Cryptosporidium human dose-response data from seven species/isolates are used to investigate six models of varying complexity that estimate infection probability as a function of dose. Previous models attempt to explicitly account for virulence differences among C. parvum isolates, using three or six species/isolates. Four models (two new) assume species/isolate differences are insignificant, and three of these (all but the exponential) allow for variable human susceptibility. These three human-focused models (fractional Poisson, exponential with immunity, and beta-Poisson) are relatively simple yet fit the data significantly better than the more complex isolate-focused models. Among these three, the one-parameter fractional Poisson model is the simplest, but it assumes that all Cryptosporidium oocysts used in the studies were capable of initiating infection. The exponential with immunity model does not require such an assumption and includes the fractional Poisson as a special case. The fractional Poisson model is an upper bound of the exponential with immunity model and applies when all oocysts are capable of initiating infection. The beta-Poisson model does not allow an immune human subpopulation; thus infection probability approaches 100% as dose becomes very large. All three of these models predict significantly (>10×) greater risk at the low doses that consumers might receive through drinking water or other environmental exposure (e.g., 72% vs. 4% infection probability for a one-oocyst dose) than previously predicted. This new insight into Cryptosporidium risk suggests that additional inactivation and removal via treatment may be needed to meet any specified risk target, such as a suggested 10^−4 annual risk of Cryptosporidium infection.
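The relationship between the fractional Poisson and exponential-with-immunity models described above can be checked numerically: the fractional Poisson is the special case where every oocyst can initiate infection (per-oocyst infectivity r = 1), and it bounds the exponential-with-immunity risk from above for any r ≤ 1. The susceptible fraction F below is invented for illustration, not a fitted value.

```python
import math

F = 0.5  # susceptible fraction of the population (hypothetical)

def exp_with_immunity(d, r):
    """Exponential-with-immunity model: only a fraction F can be infected."""
    return F * (1.0 - math.exp(-r * d))

def fractional_poisson(d):
    """Fractional Poisson = exponential-with-immunity with r = 1."""
    return exp_with_immunity(d, r=1.0)

dose = 1.0  # a one-oocyst mean dose
for r in (0.5, 0.1):
    assert exp_with_immunity(dose, r) < fractional_poisson(dose)
print(round(fractional_poisson(dose), 3))
```

Note that even the upper-bound model caps risk at F, which is how these models represent an immune subpopulation, unlike the beta-Poisson.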

13.
Survival models are developed to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple-dose data set to predict the probability of death through specifying functions of dose response and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) is an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed use different underlying dose-response functions and use the assumption that, in a multiple-dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this article. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit data sets. More accurate survival models depend upon future development of dose-response data sets specifically designed to assess potential multiple-dose effects on response and time-to-response. The process used in this article to develop the best-fitting survival model for exposure of rabbits to multiple aerosol doses of B. anthracis spores should have broad applicability to other host-pathogen systems and dosing schedules because the empirical modeling approach is based upon pathogen-specific empirically-derived parameters.
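One simple reading of the baseline model's structure (a sketch, not necessarily the article's exact formulation, and with all parameter values invented) combines an exponential dose-response for whether death occurs with a Weibull distribution for when it occurs:

```python
import math

def p_death(d, r=1e-6):
    """Exponential dose-response: probability the dose is ultimately lethal."""
    return 1.0 - math.exp(-r * d)

def weibull_cdf(t, shape=2.0, scale=4.0):
    """Time-to-death distribution (t in days); shape/scale are hypothetical."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def p_death_by(d, t):
    """Probability of death by time t after a single dose d."""
    return p_death(d) * weibull_cdf(t)

dose = 1e6  # spores (hypothetical)
print(round(p_death_by(dose, 4.0), 3), round(p_death_by(dose, 14.0), 3))
```

The separation into a response probability and a TTD distribution is what lets the multiple-dose extensions modify the hazard of each subsequent dose without changing the single-dose form.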

14.
Louis Anthony Cox, Jr., Risk Analysis, 2008, 28(6): 1749–1761
Several important risk analysis methods now used in setting priorities for protecting U.S. infrastructures against terrorist attacks are based on the formula: Risk = Threat × Vulnerability × Consequence. This article identifies potential limitations in such methods that can undermine their ability to guide resource allocations that effectively optimize risk reductions. After considering specific examples for the Risk Analysis and Management for Critical Asset Protection (RAMCAP™) framework used by the Department of Homeland Security, we address more fundamental limitations of the product formula. These include its failure to adjust for correlations among its components, nonadditivity of risks estimated using the formula, inability to use risk-scoring results to optimally allocate defensive resources, and the intrinsic subjectivity and ambiguity of Threat, Vulnerability, and Consequence numbers. Trying to directly assess probabilities for the actions of intelligent antagonists, instead of modeling how they adaptively pursue their goals in light of available information and experience, can produce ambiguous or mistaken risk estimates. Recent work demonstrates that two-level (or few-level) hierarchical optimization models can provide a useful alternative to Risk = Threat × Vulnerability × Consequence scoring rules, and also to probabilistic risk assessment (PRA) techniques that ignore rational planning and adaptation. In such two-level optimization models, the defender predicts the attacker's best response to the defender's own actions, and then chooses his or her own actions taking these best responses into account. Such models appear valuable as practical approaches to antiterrorism risk analysis.
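The two-level idea can be shown in a toy example (all numbers invented): instead of scoring each target independently, the defender anticipates the attacker's best response to each defensive allocation and minimizes the resulting worst-case loss.

```python
# Toy defender-attacker (bi-level) optimization over discrete choices.
# loss[defense][target]: consequence if the attacker hits `target`
# given that the defender chose `defense`.
loss = {
    "harden_A": {"A": 2, "B": 9},
    "harden_B": {"A": 8, "B": 3},
    "split":    {"A": 5, "B": 5},
}

def attacker_best_response(defense):
    """Inner level: the attacker picks the most damaging target."""
    return max(loss[defense], key=loss[defense].get)

def defender_choice():
    """Outer level: minimize the loss under the attacker's best response."""
    return min(loss, key=lambda d: loss[d][attacker_best_response(d)])

best = defender_choice()
print(best, loss[best][attacker_best_response(best)])
```

Hardening only one target invites an attack on the other; the adaptive adversary is exactly what a static Threat × Vulnerability × Consequence score fails to capture.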

15.
The effect of bioaerosol size was incorporated into predictive dose-response models for the effects of inhaled aerosols of Francisella tularensis (the causative agent of tularemia) on rhesus monkeys and guinea pigs, with bioaerosol diameters ranging between 1.0 and 24 μm. Aerosol-size-dependent models were formulated as modifications of the exponential and beta-Poisson dose-response models, and model parameters were estimated using maximum likelihood methods and multiple data sets of quantal dose-response data for which the aerosol sizes of inhaled doses were known. The F. tularensis dose-response data were best fit by an exponential dose-response model with a power function of particle diameter substituting for the rate parameter k that scales the applied dose. The resulting aerosol-size-dependence equation and models represent the observed dose-response results better than the estimate derived from applying the model developed by the International Commission on Radiological Protection (ICRP, 1994), which relies on differential regional lung deposition for human particle exposure.
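The model form described above, an exponential dose-response whose rate parameter is a power function of particle diameter, can be sketched as follows. The coefficients k0 and b are invented for illustration (the fitted values are in the paper); a negative exponent encodes the expectation that smaller particles penetrate deeper and are more potent per organism.

```python
import math

def p_infection(dose, diameter_um, k0=1e-3, b=-1.5):
    """Exponential model with size-dependent rate: P = 1 - exp(-k0 * D**b * d)."""
    k = k0 * diameter_um ** b  # rate parameter as a power function of diameter
    return 1.0 - math.exp(-k * dose)

# Infectivity falls as diameter grows across the study's 1-24 um range.
for diameter in (1.0, 12.0, 24.0):
    print(diameter, round(p_infection(1000, diameter), 4))
```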

16.
Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson-distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional "single-hit" dose-response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose-response models in terms of probability generating functions. It is shown formally that the theoretical single-hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single-hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single-hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose-response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose-response assessment as well as practical risk characterization are discussed.
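The probability-generating-function formulation makes the main inequality easy to check numerically. For a count distribution N with pgf G and per-pathogen infection probability r, the single-hit risk is 1 − G(1 − r); the negative binomial (a clustered special case) gives lower risk than a Poisson with the same mean. Values below are invented.

```python
import math

def risk_poisson(mean_dose, r):
    """Poisson pgf G(s) = exp(mu*(s-1)), so risk = 1 - exp(-mu*r)."""
    return 1.0 - math.exp(-mean_dose * r)

def risk_negbin(mean_dose, r, k):
    """Negative binomial pgf G(s) = (1 + mu*(1-s)/k)**(-k); smaller k = more clustering."""
    return 1.0 - (1.0 + mean_dose * r / k) ** (-k)

mu, r = 10.0, 0.05
for k in (0.5, 2.0, 10.0):
    assert risk_negbin(mu, r, k) < risk_poisson(mu, r)
    print(k, round(risk_negbin(mu, r, k), 4))
print("poisson", round(risk_poisson(mu, r), 4))
```

As k grows the negative binomial approaches the Poisson, so the clustered risk converges to the Poisson risk from below, consistent with the abstract's formal result.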

17.
We design experiments to jointly elicit risk and time preferences for the adult Danish population. Since subjects are generally risk averse, we find that joint elicitation provides estimates of discount rates that are significantly lower than those found in previous studies and more in line with what would be considered as a priori reasonable rates. The statistical specification relies on a theoretical framework that involves a latent trade-off between long-run optimization and short-run temptation. Estimation of this specification is undertaken using structural, maximum likelihood methods. Our main results based on exponential discounting are robust to alternative specifications such as hyperbolic discounting. These results have direct implications for attempts to elicit time preferences, as well as debates over the appropriate domain of the utility function when characterizing risk aversion and time consistency.
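A toy calculation (all numbers invented) illustrates the mechanism behind the abstract's main finding: inferring a discount factor from an indifference between a payment today and a larger one later yields a lower implied discount rate when the inference runs through a concave (risk-averse) utility rather than a linear one.

```python
# Implied annual discount rate from indifference between 100 now and 110
# in one year, under CRRA utility u(x) = x**(1-rho)/(1-rho) over background
# wealth w (rho = 0 gives linear utility; rho != 1 assumed).
def implied_annual_rate(w, now=100.0, later=110.0, rho=0.0):
    u = (lambda x: x) if rho == 0 else (lambda x: x ** (1 - rho) / (1 - rho))
    # Indifference: u(w + now) - u(w) = delta * (u(w + later) - u(w))
    delta = (u(w + now) - u(w)) / (u(w + later) - u(w))
    return 1.0 / delta - 1.0  # rate implied by discount factor delta

linear_rate = implied_annual_rate(w=500.0, rho=0.0)    # the usual elicitation
concave_rate = implied_annual_rate(w=500.0, rho=0.7)   # joint elicitation
print(round(linear_rate, 4), round(concave_rate, 4))
```

Under concave utility the utility gap between the two payments shrinks, so the same observed indifference is explained by a larger discount factor, i.e., a lower discount rate.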

18.
Q fever is a zoonotic disease caused by the intracellular gram-negative bacterium Coxiella burnetii (C. burnetii), which multiplies only within phagolysosomal vacuoles. Q fever may manifest as acute or chronic disease. The acute form is generally not fatal and manifests as a self-limiting febrile illness. Chronic Q fever is usually characterized by endocarditis. Q fever infection has been studied in many hosts, including humans and several animal models, through various exposure routes. The studies considered different endpoints, including death for animal models and clinical signs for human infection. In this article, animal experimental data available in the open literature were fit to suitable dose-response models using maximum likelihood estimation. Results for severe combined immunodeficient mice inoculated intraperitoneally (i.p.) with C. burnetii were best estimated with the beta-Poisson dose-response model. Similar inoculation (i.p.) trial outcomes for C57BL/6J mice were best fit by an exponential model, whereas tests run on C57BL/10ScN mice were optimally represented by a beta-Poisson dose-response model.
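The maximum likelihood fitting step used throughout these dose-response studies can be illustrated with synthetic quantal data (the doses and response counts below are invented, not the article's): each dose group contributes a binomial likelihood term, and the model parameter is chosen to maximize the total log-likelihood, here by a simple grid search.

```python
import math

data = [  # (dose, number responding, number tested) -- synthetic
    (10, 1, 10), (100, 4, 10), (1000, 9, 10),
]

def log_likelihood(r):
    """Binomial log-likelihood of the exponential model P(d) = 1 - exp(-r*d)."""
    ll = 0.0
    for dose, responding, n in data:
        p = 1.0 - math.exp(-r * dose)
        p = min(max(p, 1e-12), 1 - 1e-12)  # guard against log(0)
        ll += responding * math.log(p) + (n - responding) * math.log(1.0 - p)
    return ll

# Log-spaced grid of candidate r values from 1e-6 up to ~1.
grid = [10 ** (e / 50.0) for e in range(-300, 0)]
r_hat = max(grid, key=log_likelihood)
print("r_hat =", r_hat)
```

In practice a proper optimizer replaces the grid, and model selection (exponential vs. beta-Poisson, as in the abstract) compares the maximized likelihoods across candidate models.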

19.
Dose-response models are essential to quantitative microbial risk assessment (QMRA), providing a link between levels of human exposure to pathogens and the probability of negative health outcomes. In drinking water studies, the class of semi-mechanistic models known as single-hit models, such as the exponential and the exact beta-Poisson, has seen widespread use. In this work, an attempt is made to carefully develop the general mathematical single-hit framework while explicitly accounting for variation in (1) host susceptibility and (2) pathogen infectivity. This allows a precise interpretation of the so-called single-hit probability and precise identification of a set of statistical independence assumptions that are sufficient to arrive at single-hit models. Further analysis of the model framework is facilitated by formulating the single-hit models compactly using probability generating and moment generating functions. Among the more practically relevant conclusions drawn are: (1) for any dose distribution, variation in host susceptibility always reduces the single-hit risk compared to a constant host susceptibility (assuming equal mean susceptibilities), (2) the model-consistent representation of complete host immunity is formally demonstrated to be a simple scaling of the response, (3) the model-consistent expression for the total risk from repeated exposures deviates (gives lower risk) from the conventional expression used in applications, and (4) a model-consistent expression for the mean per-exposure dose that produces the correct total risk from repeated exposures is developed.
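Conclusion (1) above follows from Jensen's inequality, because the single-hit risk 1 − exp(−r·d) is concave in the susceptibility parameter r, and it is easy to verify numerically (the susceptibility values below are invented):

```python
import math

def risk_constant_r(r, d):
    """Single-hit risk for a fixed host susceptibility r."""
    return 1.0 - math.exp(-r * d)

def risk_variable_r(rs, d):
    """Average risk over an equally likely set of host susceptibilities."""
    return sum(1.0 - math.exp(-r * d) for r in rs) / len(rs)

d = 50.0
rs = [0.001, 0.01, 0.05]      # heterogeneous hosts (hypothetical)
r_mean = sum(rs) / len(rs)     # constant-susceptibility comparison, same mean
assert risk_variable_r(rs, d) < risk_constant_r(r_mean, d)
print(round(risk_variable_r(rs, d), 4), round(risk_constant_r(r_mean, d), 4))
```

By concavity, E[1 − exp(−r·d)] ≤ 1 − exp(−E[r]·d) for any susceptibility distribution, which is the general form of the abstract's conclusion.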

20.
This study analyzes the trade-off between funding strategies and operational performance in humanitarian operations. If a Humanitarian Organization (HO) offers donors the option of earmarking their donations, HO should expect an increase in total donations. However, earmarking creates constraints in resource allocation that negatively affect HO's operational performance. We study this trade-off from the perspective of a single HO that maximizes its expected utility as a function of total donations and operational performance. HO implements disaster response and development programs and it operates in a multi-donor market with donation uncertainty. Using a model inspired by Scarf's minimax approach and the newsvendor framework, we analyze the strategic interaction between HO and its donors. The numerical section is based on real data from 15 disasters during the period 2012–2013. We find that poor operational performance has a larger effect on HO's utility function when donors are more uncertain about HO's expected needs for disaster response. Interestingly, increasing the public awareness of development programs helps HO to get more non-earmarked donations for disaster response. Increasing non-earmarked donations improves HO's operational efficiency, which mitigates the impact of donation uncertainty on HO's utility function.
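Scarf's minimax approach, which the abstract cites as inspiration, has a well-known closed form in the newsvendor setting: only the mean and standard deviation of demand are assumed known, and the order quantity hedges against the worst-case distribution with those moments. The cost parameters below are invented for illustration.

```python
import math

def scarf_order_quantity(mu, sigma, underage_cost, overage_cost):
    """Scarf's distribution-free newsvendor quantity:
    Q* = mu + (sigma/2) * (sqrt(cu/co) - sqrt(co/cu))."""
    ratio = underage_cost / overage_cost
    return mu + (sigma / 2.0) * (math.sqrt(ratio) - 1.0 / math.sqrt(ratio))

q = scarf_order_quantity(mu=100.0, sigma=30.0, underage_cost=4.0, overage_cost=1.0)
print(round(q, 2))
```

When shortages cost more than leftovers (cu > co), the rule orders above the mean, which matches the humanitarian setting where unmet disaster-response needs are the costlier error.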

