Similar Literature
 A total of 20 similar articles were retrieved.
1.
王扬, 徐维军, 徐寅峰. 《管理学报》, 2011, 8(12): 1866-1871
Using online algorithms and competitive analysis, this paper studies optimal competitive strategies and a risk-reward model for the financial leasing problem in which ownership of the asset is transferred to the lessee after a specified lease period. The optimal offline solution to the problem is derived first; then, depending on the relationship between the agreed lease term, the purchase price, and the rental fee, online strategies and the corresponding competitive-ratio analyses are given for three cases; finally, within the risk-reward framework for online problems proposed by al-Binali, two forms of forecast and the corresponding optimal online reward strategies are presented.
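The lease-then-own setting above is structurally close to the classic ski-rental problem studied in competitive analysis. As a minimal illustration only (not the paper's model, and with hypothetical numbers), the sketch below computes the worst-case competitive ratio of the standard break-even online strategy, which keeps renting until the accumulated rent would reach the purchase price and then buys.

```python
import math

def breakeven_competitive_ratio(purchase_price: float, rent_per_period: float) -> float:
    """Worst-case competitive ratio of the classic break-even ski-rental strategy.

    The online player rents for k-1 periods, where k = ceil(p/r), and buys in
    period k.  The adversary's worst case ends usage right after the purchase:
    online cost = (k-1)*r + p, offline cost = min(k*r, p).
    """
    k = math.ceil(purchase_price / rent_per_period)
    online_cost = (k - 1) * rent_per_period + purchase_price
    offline_cost = min(k * rent_per_period, purchase_price)
    return online_cost / offline_cost

# Hypothetical example: purchase price 100, rent 10 per period -> ratio 1.9 (i.e., 2 - r/p)
print(breakeven_competitive_ratio(100.0, 10.0))
```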

2.
It is well recognized that adaptive and flexible flood risk strategies are required to account for future uncertainties. Development of such strategies is, however, a challenge. Climate change alone is a significant complication, but, in addition, complexities exist in trying to identify the most appropriate set of mitigation measures, or interventions. There is a range of economic and environmental performance measures that require consideration, and the spatial and temporal aspects of evaluating their performance are complex. All these elements pose severe difficulties for decision makers. This article describes a decision support methodology that has the capability to assess the most appropriate set of interventions to make in a flood system and the opportune time to make these interventions, given the future uncertainties. The flood risk strategies have been explicitly designed to allow for flexible adaptive measures by capturing the concepts of real options and multiobjective optimization to evaluate potential flood risk management opportunities. A state-of-the-art flood risk analysis tool is employed to evaluate the risk associated with each strategy at future points in time, and a multiobjective genetic algorithm is used to search for the optimal adaptive strategies. The modeling system has been applied to a reach of the Thames Estuary (London, England), and initial results show that including flexibility is advantageous, while the outputs provide decision makers with supplementary knowledge that has not previously been considered.
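As a toy illustration of why flexibility can pay off (not the authors' model; the scenario probability, costs, and discount rate below are hypothetical, and both strategies are assumed to keep flood risk acceptable in their respective scenarios), the following sketch compares an inflexible "build the full defence now" strategy with an adaptive "build small now, upgrade later only if the severe scenario materialises" strategy in expected present-value terms.

```python
def present_value(cost: float, year: int, discount_rate: float = 0.035) -> float:
    return cost / (1.0 + discount_rate) ** year

# Hypothetical inputs
p_severe = 0.4             # probability the severe climate scenario is confirmed by year 30
full_defence_now = 900.0   # capital cost of the full-height defence, built today
small_defence_now = 500.0  # capital cost of a lower defence, built today
upgrade_later = 550.0      # cost of raising the defence at year 30 if needed

# Inflexible strategy: commit to the full defence today
cost_inflexible = present_value(full_defence_now, year=0)

# Adaptive strategy: build small now, upgrade only under the severe scenario
cost_adaptive = (present_value(small_defence_now, year=0)
                 + p_severe * present_value(upgrade_later, year=30))

print(f"inflexible: {cost_inflexible:.0f}, adaptive: {cost_adaptive:.0f}")
# The difference is (one crude view of) the option value of deferring the upgrade decision.
```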

3.
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches used to model rare events, such as fault trees and event trees, suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry.
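To make the hierarchical idea concrete, here is a minimal sketch (not the article's model; counts, exposures, priors, and grids are all hypothetical) of a two-level Poisson-Gamma model for precursor-event rates. Source-to-source variability is captured by a population Gamma(α, β) distribution over source-specific rates, and the hyperparameters are handled by a crude grid approximation standing in for the MCMC a full analysis would use.

```python
import numpy as np
from scipy.special import gammaln

# Hypothetical precursor counts and exposure (years) from three data sources
k = np.array([2, 5, 1])
T = np.array([10.0, 20.0, 8.0])

# Level 1: k_i ~ Poisson(lambda_i * T_i);  Level 2: lambda_i ~ Gamma(alpha, rate=beta)
# A flat hyperprior over a coarse (alpha, beta) grid stands in for a full MCMC treatment.
alphas = np.linspace(0.2, 5.0, 40)
betas = np.linspace(0.2, 20.0, 60)

def log_marginal(a, b):
    # Integrating lambda_i out gives a negative-binomial marginal for each source.
    return np.sum(gammaln(a + k) - gammaln(a) - gammaln(k + 1)
                  + a * np.log(b) + k * np.log(T) - (a + k) * np.log(b + T))

logw = np.array([[log_marginal(a, b) for b in betas] for a in alphas])
w = np.exp(logw - logw.max())
w /= w.sum()   # posterior weights over the hyperparameter grid (flat hyperprior)

# Posterior mean rate of source 0: average the conjugate Gamma(a+k, b+T) mean over the grid
A, B = np.meshgrid(alphas, betas, indexing="ij")
post_mean_source0 = np.sum(w * (A + k[0]) / (B + T[0]))
pop_mean = np.sum(w * A / B)   # population ("new source") mean rate

print(f"source 0 posterior mean rate: {post_mean_source0:.3f} /yr")
print(f"population mean rate: {pop_mean:.3f} /yr")
```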

4.
In a systematic process of project risk management, after risk assessment has been carried out, risk analysts face the phase of assessing and selecting project risk response actions (RAs). As many researchers have noted, there are few systematic, well-developed solutions in the area of risk response assessment and selection. The present article introduces a methodology, including a modeling approach, for selecting a set of RAs that minimizes the undesirable deviation from achieving the project scope. The developed objective function comprises the three key success criteria of a project, namely time, quality, and cost. Our model integrates overall project management into project risk response planning (P2RP). Furthermore, the proposed model places equal importance on both "risk" and "response." We believe that applying the proposed model helps the project risk analyst deal with complicated RA selection problems in an effective and efficient manner. The proposed model was applied to projects in the construction industry, where it showed substantial time, cost, and quality improvements.
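As a schematic of what a response-action selection model can look like (not the authors' formulation; the budget, action effects, targets, and weights below are hypothetical), the sketch enumerates subsets of candidate RAs and picks the one that minimises a weighted deviation from the project's time, cost, and quality targets.

```python
from itertools import combinations

# Hypothetical candidate response actions:
#   (name, cost of action, reduction in schedule overrun (days), reduction in quality shortfall (points))
actions = [
    ("extra QA review", 20.0, 2.0, 4.0),
    ("second supplier", 35.0, 6.0, 1.0),
    ("design re-check", 15.0, 1.0, 3.0),
    ("schedule buffer", 25.0, 5.0, 0.0),
]

BUDGET = 60.0                               # money available for responses
BASE_DELAY, BASE_QUALITY_GAP = 10.0, 6.0    # residual risk impact with no responses
W_TIME, W_COST, W_QUALITY = 1.0, 0.5, 1.5   # weights on the three success criteria

def objective(subset):
    cost = sum(a[1] for a in subset)
    delay = max(0.0, BASE_DELAY - sum(a[2] for a in subset))
    quality_gap = max(0.0, BASE_QUALITY_GAP - sum(a[3] for a in subset))
    return W_TIME * delay + W_COST * cost + W_QUALITY * quality_gap, cost

best = None
for r in range(len(actions) + 1):
    for subset in combinations(actions, r):
        score, cost = objective(subset)
        if cost <= BUDGET and (best is None or score < best[0]):
            best = (score, [a[0] for a in subset])

print(best)   # (best weighted deviation, names of selected response actions)
```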

5.
A de minimis risk management strategy sets a threshold so that risks below the specified level are defined as trivial and exempted from further consideration. The intended purpose is to help avoid inappropriate and wasteful concern with insignificant low-level risks. In most instances a de minimis strategy is likely to have beneficial or innocuous effects, but under certain circumstances large differences may develop between nominal and actual de minimis levels. The potential for such discrepancies illustrates why de minimis (and all other risk management) strategies should be evaluated on the basis of the portfolio of risks that would accumulate from applying such strategies over time, rather than on the apparent reasonableness of any single instance of application.
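A one-line calculation shows how a portfolio of individually "trivial" risks can accumulate; the hazard count and threshold below are purely illustrative.

```python
# Hypothetical: 200 independent hazards, each exempted at a de minimis level of 1e-6 per year
n_hazards, de_minimis = 200, 1e-6

# Aggregate annual probability that at least one exempted hazard occurs
aggregate = 1.0 - (1.0 - de_minimis) ** n_hazards
print(f"{aggregate:.2e}")   # ~2e-4, i.e. roughly 200x the nominal per-hazard threshold
```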

6.
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first concerns the likelihood values of input events, and the second concerns interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; to deal with uncertainties, however, probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and, even if available, are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework for a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies among events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed.
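One common way to relax the independence assumption in FTA gates is to interpolate between the independent and perfectly dependent cases with a dependency coefficient. The sketch below shows that idea for an AND gate; this is an illustrative formulation, not necessarily the exact one used in the article, and the basic-event probabilities are hypothetical.

```python
def and_gate(p_a: float, p_b: float, c_d: float) -> float:
    """AND-gate probability with dependency coefficient c_d in [0, 1].

    c_d = 0 reproduces independence (p_a * p_b); c_d = 1 gives perfect
    positive dependence (min(p_a, p_b)); values in between interpolate.
    """
    return (1.0 - c_d) * p_a * p_b + c_d * min(p_a, p_b)

# Hypothetical basic-event probabilities
p_a, p_b = 0.02, 0.05
for c_d in (0.0, 0.5, 1.0):
    print(c_d, and_gate(p_a, p_b, c_d))
```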

7.
Paul F. Deisler, Jr. Risk Analysis, 2002, 22(3): 405-413
The destruction by terrorists of the twin towers of the World Trade Center and the major damage wrought to the Pentagon on September 11, 2001, followed closely by the bioterrorist anthrax attacks via the mails, raised the question of whether risk analysis might have a place in defending the United States against terrorist attacks. After first reviewing the multifaceted nature of terrorism and the reasons it is likely to become endemic in world society in the long term, just as other areas of crime are endemic, this article surveys several fields of risk analysis, finding possible short- and long-term uses of risk analysis. The areas chiefly considered are risk communication and chemical, biological, and technological risk analysis. Broad policy and other uses are also considered. The author finds that risk analysis has already played some role, perhaps informally, but he sees the possibility for a much larger, formal one, a role that is centrally important for the present and future of the United States and the world.

8.
9.
Grant Purdy. Risk Analysis, 2010, 30(6): 881-886
Last year saw the publication of ISO 31000:2009, a new globally accepted standard for risk management, together with a new associated vocabulary in ISO Guide 73:2009. These were developed through a consensus-driven process over four years, through seven drafts, and involving the input of hundreds of risk management professionals around the world. The new standard supports a new, simple way of thinking about risk and risk management and is intended to begin the process of resolving the many inconsistencies and ambiguities that exist among the many different approaches and definitions. While most decision makers seem to welcome the new standard, and it has so far received very good reviews, it does create challenges for those who use language and approaches that are unique to their area of work but different from the new standard and guide. The need for compromise and change is the inevitable consequence of standardization.

10.
Emergency response is directly related to the allocation of emergency rescue resources. Efficient emergency response can reduce loss of life and property, limit damage from the primary impact, and minimize damage from derivative impacts. An appropriate risk analysis approach in the event of accidents is one rational way to assist emergency response. In this article, a cellular automata-based systematic approach for conducting risk analysis in emergency response is presented. Three general rules, i.e., the diffusive effect, the transporting effect, and the dissipative effect, are developed to implement the cellular automata transition function. The approach takes multiple social factors, such as population density and population sensitivity, into consideration, and it also considers the risk of domino accidents, which is increasing due to growing congestion in urban industrial complexes and rising population density. In addition, two risk indices, individual risk and aggregated weighted risk, are proposed to assist decision making by emergency managers during emergency response. Individual risk can be useful for planning evacuation strategies, while aggregated weighted risk can help emergency managers allocate rescue resources rationally according to the degree of danger in each vulnerable area and optimize emergency response programs.
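A minimal grid sketch of the three transition effects named above (diffusive, transporting, dissipative), with hypothetical coefficients and population weights, and with no claim to reproduce the article's exact rules:

```python
import numpy as np

def step(risk: np.ndarray, d: float = 0.2, wind=(0, 1), t: float = 0.1, k: float = 0.05) -> np.ndarray:
    """One cellular-automata update: diffusion to neighbours, transport along
    a wind direction, and dissipation (decay).  All coefficients are illustrative."""
    up, down = np.roll(risk, -1, axis=0), np.roll(risk, 1, axis=0)
    left, right = np.roll(risk, -1, axis=1), np.roll(risk, 1, axis=1)
    diffused = risk + d * (up + down + left + right - 4 * risk)                       # diffusive effect
    transported = diffused + t * (np.roll(diffused, wind, axis=(0, 1)) - diffused)    # transporting effect
    return (1.0 - k) * transported                                                    # dissipative effect

grid = np.zeros((50, 50))
grid[25, 25] = 1.0              # initial release at the accident cell
for _ in range(20):
    grid = step(grid)

population = np.random.default_rng(0).uniform(0, 1, grid.shape)   # hypothetical population-density weights
aggregated_weighted_risk = float((grid * population).sum())
print(aggregated_weighted_risk)
```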

11.
In November 2001, the Monterey Institute of International Studies convened a workshop on bioterrorism threat assessment and risk management. Risk assessment practitioners from various disciplines, but without specialized knowledge of terrorism, were brought together with security and intelligence threat analysts to stimulate an exchange that could be useful to both communities. This article, prepared by a subset of the participants, comments on the workshop's findings and their implications and makes three recommendations, two short term (use of threat assessment methodologies and vulnerability analysis) and one long term (application of quantitative risk assessment and modeling), regarding the practical application of risk assessment methods to bioterrorism issues.

12.
Driven by market pressures, financial service firms are increasingly partnering with independent vendors to create service networks that deliver greater profits while ensuring high service quality. In the management of call center networks, these partnerships are common and form an integral part of the customer care and marketing strategies in the financial services industry. For a financial services firm, configuring such a call center service network entails determining which partners to select and how to distribute service requests among vendors, while incorporating their capabilities, costs, and revenue‐generating abilities. Motivated by a problem facing a Fortune 500 financial services provider, we develop and apply a novel mixed integer programming model for the service network configuration problem. Our tactical decision support model effectively accounts for the firm's costs by capturing the impact of service requirements on vendor staffing levels and seat requirements, and permits imposing call routing preferences and auxiliary service costs. We implemented the model and applied it to data from an industry partner. Results suggest that our approach can generate considerable cost savings and substantial additional revenues, while ensuring high service quality. Results based on test instances demonstrate similar savings and outperform two rule‐based methods for vendor assignment.
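The flavour of such a configuration model can be shown with a toy version (hypothetical vendors, costs, capacities, and demand; the model in the article is a much richer mixed-integer program with staffing and routing detail): enumerate vendor subsets and, within each subset, route calls greedily by unit cost subject to capacity.

```python
from itertools import combinations

# Hypothetical vendors: (name, fixed setup cost, cost per call, capacity in calls)
vendors = [
    ("in-house", 400.0, 1.0, 60_000),
    ("vendor A", 250.0, 0.8, 50_000),
    ("vendor B", 150.0, 0.9, 40_000),
]
DEMAND = 90_000   # forecast call volume

def cost_of(subset):
    """Fixed costs plus greedy (cheapest-first) routing of DEMAND across the subset."""
    remaining, total = DEMAND, sum(v[1] for v in subset)
    for _, _, unit_cost, capacity in sorted(subset, key=lambda v: v[2]):
        routed = min(remaining, capacity)
        total += routed * unit_cost
        remaining -= routed
    return total if remaining == 0 else float("inf")   # infeasible if capacity falls short

best = min(
    (subset for r in range(1, len(vendors) + 1) for subset in combinations(vendors, r)),
    key=cost_of,
)
print([v[0] for v in best], cost_of(best))
```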

13.
14.
When stricken by a terrorist attack, a war, or a natural disaster, an economic unit or a critical infrastructure may suffer a significant loss of productivity. More importantly, due to interdependency or interconnectedness, this initial loss may propagate into other systems and eventually lead to a much greater derivative loss. This is known as a cascading effect. It is demonstrated in this article that the cascading effect and the derivative loss can be significantly reduced by effective risk management. This is accomplished by deliberately distributing the initial inoperability to other systems so that the total loss (or inoperability) is minimized. The optimal distribution strategy is found by a linear programming technique. The same risk management approach can also be applied to situations where objectives need to be prioritized. A case study featuring 12 economic sectors illustrates the theory. The result suggests that, using the same amount of resources, minimizing the risk (inoperability) of infrastructures will generally give rise to the highest payoff, whereas overlooking it may result in the greatest total loss. The framework developed in this work uses a steady-state approach that applies primarily to managing situations where the attack is catastrophic, resulting in very long recovery times.
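A stripped-down version of the optimisation described above (not the article's 12-sector data; the interdependency matrix and shock size are hypothetical, and the standard inoperability input-output propagation q = (I − A*)⁻¹c is assumed): because total loss is then linear in how the initial inoperability c is distributed, the best distribution can be found with a linear program.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical 3-sector interdependency matrix A*
A_star = np.array([
    [0.10, 0.30, 0.05],
    [0.20, 0.05, 0.10],
    [0.05, 0.10, 0.15],
])
total_shock = 0.30   # total initial inoperability that must be absorbed somewhere

# Steady-state inoperability q = (I - A*)^-1 c, so total loss 1'q = weights . c
weights = np.ones(3) @ np.linalg.inv(np.eye(3) - A_star)

# Minimise total inoperability subject to c >= 0 and sum(c) = total_shock
res = linprog(c=weights,
              A_eq=np.ones((1, 3)), b_eq=[total_shock],
              bounds=[(0, total_shock)] * 3)
print("allocation of initial inoperability:", res.x)
print("resulting total inoperability:", res.fun)
```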

15.
This article discusses mitigation strategies to protect traffic routes from snow avalanches. Up to now, mitigation of snow avalanches on many roads and railways in the Alps has relied on avalanche sheds, which require large initial investments, resulting in high opportunity costs. Therefore, avalanche risk managers have increasingly adopted organizational mitigation measures, such as warning systems and closure policies, instead. The effectiveness of these measures is, however, greatly dependent on human decisions. In this article, we present a method for optimizing avalanche mitigation for traffic routes in terms of both their risk reduction impact and their net benefit to society. First, we introduce a generic framework for assessing avalanche risk and for quantifying the impact of mitigation. This allows for sound cost-benefit comparisons between alternative mitigation strategies. Second, we illustrate the framework with a case study from Switzerland. Our findings suggest that site-specific characteristics of avalanche paths, as well as the economic importance of a traffic route, are decisive for the choice of optimal mitigation strategies. On routes endangered by few avalanche paths with frequent avalanche occurrences, structural measures are most efficient, whereas reliance on organizational mitigation is often the most appropriate strategy on routes endangered by many paths with infrequent or fuzzy avalanche risk. Finally, keeping a traffic route open may be very important for tourism or the transport industry. Hence, local economic value may promote the use of a hybrid strategy that combines organizational and structural measures to optimize the resource allocation of avalanche risk mitigation.
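The cost-benefit logic can be illustrated with a toy comparison of a structural measure (shed) against an organisational one (warning and closure), using purely hypothetical costs and risk-reduction factors rather than the case-study data.

```python
def expected_annual_cost(annual_risk_cost: float, risk_reduction: float,
                         annualised_capital: float, operating_cost: float) -> float:
    """Residual risk cost plus the annualised cost of the mitigation measure."""
    return annual_risk_cost * (1.0 - risk_reduction) + annualised_capital + operating_cost

baseline_risk = 300_000.0   # hypothetical expected annual damage and closure losses with no mitigation

shed    = expected_annual_cost(baseline_risk, risk_reduction=0.95, annualised_capital=120_000.0, operating_cost=10_000.0)
closure = expected_annual_cost(baseline_risk, risk_reduction=0.70, annualised_capital=20_000.0, operating_cost=60_000.0)

print(f"shed: {shed:,.0f}/yr   closure policy: {closure:,.0f}/yr   do nothing: {baseline_risk:,.0f}/yr")
# With a high baseline risk (frequent avalanches) the shed wins; rerun with a lower
# baseline_risk to see the organisational measure become the cheaper option.
```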

16.
With the growing influence of online social media, firms increasingly take an active role in interacting with consumers in social media. For many firms, their first step in online social media is management responses, where the management responds to customers' comments about the firm or its products and services. In this article, we measure the impact of management responses on customer satisfaction using data retrieved from a major online travel agency in China. Applying a panel data model that controls for regression toward the mean and heterogeneity in individual preference for hotels, we find that online management responses are highly effective among low satisfaction customers but have limited influence on other customers. Moreover, we show that the public nature of online management responses introduces a new dynamic among customers. Although online management responses increase future satisfaction of the complaining customers who receive the responses, they decrease future satisfaction of complaining customers who observe but do not receive management responses. The result is consistent with the peer‐induced fairness theory.

17.
The application of quantitative microbial risk assessments (QMRAs) to understand and mitigate risks associated with norovirus is increasingly common, as there is a high frequency of outbreaks worldwide. A key component of QMRA is the dose-response analysis, which is the mathematical characterization of the association between dose and outcome. For norovirus, multiple dose-response models are available that assume either a disaggregated or an aggregated intake dose. This work reviewed the dose-response models currently used in QMRA and compared predicted risks from waterborne exposures (recreational and drinking) using all available dose-response models. The review found that the majority of published QMRAs of norovirus use the 1F1 hypergeometric dose-response model with α = 0.04, β = 0.055. This dose-response model predicted relatively high risk estimates compared with other dose-response models for doses in the range of 1-1,000 genomic equivalent copies. The difference in predicted risk among dose-response models was largest for small doses, which has implications for drinking water QMRAs where the concentration of norovirus is low. Based on the review, a set of best practices was proposed to encourage the careful consideration and reporting of important assumptions in the selection and use of dose-response models in QMRA of norovirus. Finally, in the absence of one best norovirus dose-response model, multiple models should be used to provide a range of predicted outcomes for probability of infection.
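For reference, the sketch below evaluates the exact beta-Poisson (Kummer 1F1) dose-response form, which is assumed here to be the 1F1 hypergeometric model the review refers to, alongside the familiar approximate beta-Poisson; the functional form P(d) = 1 − ₁F₁(α, α+β, −d) is the commonly published one and should be checked against the original dose-response papers before use.

```python
import numpy as np
from scipy.special import hyp1f1

ALPHA, BETA = 0.04, 0.055   # parameter values cited in the review for norovirus

def p_infection_exact(dose):
    """Exact beta-Poisson (Kummer 1F1) dose-response: P = 1 - 1F1(a, a+b, -dose)."""
    return 1.0 - hyp1f1(ALPHA, ALPHA + BETA, -np.asarray(dose, dtype=float))

def p_infection_approx(dose):
    """Approximate beta-Poisson, valid only when beta >> alpha and beta >> 1 (not the case here)."""
    return 1.0 - (1.0 + np.asarray(dose, dtype=float) / BETA) ** (-ALPHA)

doses = np.array([1.0, 10.0, 100.0, 1000.0])   # genomic equivalent copies
for d, pe, pa in zip(doses, p_infection_exact(doses), p_infection_approx(doses)):
    print(f"dose {d:7.0f} GEC   exact {pe:.3f}   approx {pa:.3f}")
```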

18.
In counterterrorism risk management decisions, the analyst can choose to represent terrorist decisions as defender uncertainties or as attacker decisions. We perform a comparative analysis of probabilistic risk analysis (PRA) methods including event trees, influence diagrams, Bayesian networks, decision trees, game theory, and combined methods on the same illustrative examples (container screening for radiological materials) to get insights into the significant differences in assumptions and results. A key tenet of PRA and decision analysis is the use of subjective probability to assess the likelihood of possible outcomes. For each technique, we compare the assumptions, probability assessment requirements, risk levels, and potential insights for risk managers. We find that assessing the distribution of potential attacker decisions is a complex judgment task, particularly considering the adaptation of the attacker to defender decisions. Intelligent adversary risk analysis and adversarial risk analysis are extensions of decision analysis and sequential game theory that help to decompose such judgments. These techniques explicitly show the adaptation of the attacker and the resulting shift in risk based on defender decisions.
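The core distinction the article examines can be reduced to a toy calculation (hypothetical targets, probabilities, and losses): treating the attacker as a static uncertainty fixes the attack distribution, whereas treating the attacker as a decision-maker lets the attack shift to whichever target the defender leaves least protected.

```python
# Hypothetical consequences (losses) if an attack succeeds at each target
loss = {"port": 100.0, "airport": 60.0}

# Success probability of an attack given the defender's screening choice at that target
p_success = {"screened": 0.2, "unscreened": 0.8}

def risk_static(defended):
    """Attacker modeled as uncertainty: attack site fixed at 50/50 regardless of defence."""
    return sum(0.5 * p_success["screened" if site == defended else "unscreened"] * loss[site]
               for site in loss)

def risk_adaptive(defended):
    """Attacker modeled as a decision-maker: attacks the site with highest expected damage."""
    return max(p_success["screened" if site == defended else "unscreened"] * loss[site]
               for site in loss)

for defended in loss:
    print(f"defend {defended:8s}  static-threat risk {risk_static(defended):5.1f}"
          f"   adaptive-attacker risk {risk_adaptive(defended):5.1f}")
```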

19.
International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on the specific risk analysis approaches and procedures applicable to clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to the manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effects analysis / failure mode, effects, and criticality analysis (FMEA/FMECA) risk analysis technique, associated with direct estimation of severity, occurrence, and detection, is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed.
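A minimal sketch of the pseudo-quantitative FMEA/FMECA ranking described above (hypothetical failure modes and 1-10 ratings, not the study's actual data): compute risk priority numbers from severity, occurrence, and detection, then Pareto-rank them to find the failure modes to act on first.

```python
# Hypothetical failure modes with severity, occurrence, detection ratings (1-10 scale)
failure_modes = [
    ("collagenase lot under-performs",   7, 6, 5),
    ("operator deviation in wash step",  6, 5, 4),
    ("tissue heterogeneity out of spec", 8, 3, 6),
    ("incubator temperature drift",      5, 2, 2),
]

# Risk priority number (RPN) = severity x occurrence x detection
ranked = sorted(((s * o * d, name) for name, s, o, d in failure_modes), reverse=True)

# Simple Pareto cut: act on the modes that make up ~80% of the summed RPN
total = sum(rpn for rpn, _ in ranked)
running, priority = 0.0, []
for rpn, name in ranked:
    priority.append((name, rpn))
    running += rpn
    if running >= 0.8 * total:
        break

print(priority)
```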

20.
Duan Li. Risk Analysis, 2012, 32(11): 1856-1872
Roy pioneers the concept and practice of risk management of disastrous events via his safety-first principle for portfolio selection. More specifically, his safety-first principle advocates an optimal portfolio strategy generated from minimizing the disaster probability, subject to the budget constraint and the mean constraint that the expected final wealth is not less than a preselected disaster level. This article studies the dynamic safety-first principle in continuous time and its application in asset and liability management. We reveal that the distortion resulting from dropping the mean constraint, a common practice to approximate Roy's original setting, either leads to a trivial case or changes the problem nature completely to a target-reaching problem, which produces a highly leveraged trading strategy. Recognizing the ill-posed nature of the corresponding Lagrangian method when retaining the mean constraint, we invoke a wisdom observed from a limited funding-level regulation of pension funds and modify the original safety-first formulation accordingly by imposing an upper bound on the funding level. This model revision enables us to solve completely the safety-first asset-liability problem by a martingale approach and to derive an optimal policy that follows faithfully the spirit of the safety-first principle and demonstrates a prominent nature of fighting for the best and preventing disaster from happening.
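Schematically, the safety-first portfolio problem described above can be written as the following chance-constrained program; this is a generic static statement of the principle, not the article's continuous-time asset-liability formulation.

```latex
\begin{aligned}
\min_{\pi}\quad & \mathbb{P}\!\left(W_T \le d\right) && \text{(disaster probability)}\\
\text{s.t.}\quad & \mathbb{E}\!\left[W_T\right] \ge m, && \text{(mean constraint at a preselected level } m\text{)}\\
& W_0 = w_0, && \text{(budget constraint)}
\end{aligned}
```

Here $W_T$ is the final wealth generated by portfolio strategy $\pi$ and $d$ is the preselected disaster level. As the abstract notes, the article's modification additionally imposes an upper bound on the funding level, which restores well-posedness of the Lagrangian/martingale solution.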

