Similar Literature
20 similar documents retrieved (search time: 62 ms)
1.
We consider retail space‐exchange problems where two retailers exchange shelf space to increase accessibility to more of their consumers in more locations without opening new stores. Using the Hotelling model, we find two retailers’ optimal prices, given their host and guest space in two stores under the space‐exchange strategy. Next, using the optimal space‐dependent prices, we analyze a non‐cooperative game, where each retailer makes a space allocation decision for the retailer's own store. We show that the two retailers will implement such a strategy in the game, if and only if their stores are large enough to serve more than one‐half of their consumers. Nash equilibrium for the game exists, and its value depends on consumers’ utilities and trip costs as well as the total available space in each retailer's store. Moreover, as a result of the space‐exchange strategy, each retailer's prices in two stores are both higher than the retailer's price before the space exchange, but they may or may not be identical.  相似文献   
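The pricing logic here rests on the Hotelling linear-city model. As a rough, self-contained illustration of that building block only (not the authors' space-exchange formulation, and with invented parameter values), the sketch below recovers the textbook duopoly equilibrium price c + t by best-response iteration:

```python
import numpy as np

# Minimal Hotelling linear-city sketch (illustrative only, not the paper's
# space-exchange model): two retailers at the ends of [0, 1], consumers
# uniform on [0, 1], unit trip cost t, common marginal cost c.
t, c = 1.0, 0.5

def demand(p_own, p_rival):
    """Location of the indifferent consumer gives the retailer's demand share."""
    x_hat = (p_rival - p_own + t) / (2 * t)
    return np.clip(x_hat, 0.0, 1.0)

def best_response(p_rival):
    """Profit-maximizing price against a fixed rival price (grid search)."""
    grid = np.linspace(c, c + 3 * t, 2001)
    profits = (grid - c) * demand(grid, p_rival)
    return grid[np.argmax(profits)]

p1 = p2 = c  # start from marginal-cost pricing
for _ in range(50):  # iterate best responses until prices settle
    p1, p2 = best_response(p2), best_response(p1)

print(f"equilibrium prices: p1={p1:.3f}, p2={p2:.3f}  (theory: c + t = {c + t:.3f})")
```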

2.
Choice models and neural networks are two approaches used in modeling selection decisions. Defining model performance as the out‐of‐sample prediction power of a model, we test two hypotheses: (i) choice models and neural network models are equal in performance, and (ii) hybrid models consisting of a combination of choice and neural network models perform better than each stand‐alone model. We perform statistical tests for two classes of linear and nonlinear hybrid models and compute the empirical integrated rank (EIR) indices to compare the overall performances of the models. We test the above hypotheses using data on various brand and store choices for three consumer products. Extensive jackknifing and out‐of‐sample tests for four different model specifications are applied to increase the external validity of the results. Our results show that using neural networks has a higher probability of resulting in better performance. Our findings also indicate that hybrid models outperform stand‐alone models, in that using hybrid models guarantees overall results equal to or better than those of the two stand‐alone models. The improvement is particularly significant in cases where neither of the two stand‐alone models is very accurate in prediction, indicating that the proposed hybrid models may capture aspects of predictive accuracy that neither stand‐alone model is capable of on its own. Our results are particularly important in brand management and customer relationship management, indicating that multiple technologies and mixtures of technologies may yield more accurate and reliable outcomes than individual ones.  相似文献

3.
Alexander H. Hübner. Omega, 2012, 40(2): 199-209
Retail requires efficient decision support to manage increasing product proliferation and various consumer choice effects with limited shelf space. Our goal is to identify, describe and compare decision support systems for category planning. This research analyzes quantitative models and software applications in assortment and shelf space management and contributes to a more integrated modeling approach. There are difficulties commonly involved in the use of commercial software and in the implementation and transfer of scientific models. Scientific decision models focus either on space-dependent demand or on substitution effects, whereas software applications use simplistic rules of thumb. We show that retail assortment planning models neglect space-elastic demand and largely also ignore the constraints of limited shelf space. Shelf space management streams, on the other hand, mostly omit substitution effects between products when products are delisted or out-of-stock, which is the focus of consumer choice models in assortment planning. Also, the problem sizes of the models are often not relevant for realistic category sizes. Addressing these issues, this paper provides a state-of-the-art overview and research framework for integrated assortment and shelf space planning.  相似文献
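Space-elastic demand, one of the effects the review says shelf space models must capture, can be illustrated with a toy allocation problem; the sketch below is a generic example with invented demand parameters, not the integrated model the paper proposes:

```python
import numpy as np

# Illustrative shelf-space allocation under space-elastic demand (not the
# integrated assortment/shelf-space model the review calls for): expected
# demand_i = a_i * s_i**beta_i, where s_i is the number of facings assigned
# to product i. Because each term is concave in s_i (0 < beta < 1), assigning
# one facing at a time to the product with the largest marginal gain is optimal.
a      = np.array([40.0, 25.0, 15.0, 10.0])   # baseline demand scale (invented)
beta   = np.array([0.6, 0.7, 0.5, 0.8])       # space elasticities (invented)
margin = np.array([1.2, 1.5, 2.0, 1.8])       # unit profit margins (invented)
total_facings = 20

def shelf_profit(s):
    return float(np.sum(margin * a * s ** beta))

s = np.zeros(len(a))
for _ in range(total_facings):
    gains = [shelf_profit(s + np.eye(len(a))[i]) - shelf_profit(s) for i in range(len(a))]
    s[int(np.argmax(gains))] += 1

print("facings per product:", s.astype(int), "| category profit:", round(shelf_profit(s), 1))
```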

4.
In this study, a variance‐based global sensitivity analysis method was first applied to a contamination assessment model of Listeria monocytogenes in cold‐smoked, vacuum‐packed salmon at consumption. The impact of the choice of modeling approach (populational or cellular) for the primary and secondary models, as well as the effect of their associated input factors on the final contamination level, was investigated. Results provided a subset of important factors, including the food water activity, its storage temperature, and the duration of storage in the domestic refrigerator. A refined sensitivity analysis was then performed to rank these important factors, tested over narrower ranges of variation corresponding to their current distributions, using three techniques: ANOVA, the Spearman correlation coefficient, and partial least squares regression.  相似文献
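As a reminder of what a variance-based (Sobol) sensitivity index measures, here is a minimal pick-and-freeze Monte Carlo estimator applied to a toy surrogate of a contamination model; the surrogate function, inputs, and ranges are invented for illustration and are not those of the published salmon model:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy stand-in for a contamination model: log10 final count as a function
    # of water activity (aw), storage temperature (T, °C) and storage time (days).
    aw, T, days = x[:, 0], x[:, 1], x[:, 2]
    mu = 0.1 * (T - 2.0) * (aw - 0.90) * 50          # crude growth-rate surrogate
    return 2.0 + np.clip(mu, 0, None) * days / 7.0   # illustrative only

# Input ranges (invented): aw in [0.92, 0.99], T in [2, 12] °C, time in [1, 28] d
lo = np.array([0.92, 2.0, 1.0])
hi = np.array([0.99, 12.0, 28.0])
N, d = 20000, 3

A = lo + (hi - lo) * rng.random((N, d))
B = lo + (hi - lo) * rng.random((N, d))
fA, fB = model(A), model(B)
V = np.var(np.concatenate([fA, fB]))

for i, name in enumerate(["water activity", "temperature", "storage time"]):
    AB = A.copy(); AB[:, i] = B[:, i]                # "pick and freeze" column i
    S_i = np.mean(fB * (model(AB) - fA)) / V         # Saltelli (2010) first-order estimator
    print(f"first-order Sobol index, {name}: {S_i:.2f}")
```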

5.
Yacov Y. Haimes. Risk Analysis, 2012, 32(11): 1834-1845
Natural and human‐induced disasters affect organizations in myriad ways because of the inherent interconnectedness and interdependencies among human, cyber, and physical infrastructures, but more importantly, because organizations depend on the effectiveness of people and on the leadership they provide to the organizations they serve and represent. These human–organizational–cyber–physical infrastructure entities are termed systems of systems. Given the multiple perspectives that characterize them, they cannot be modeled effectively with a single model. The focus of this article is: (i) the centrality of the states of a system in modeling; (ii) the efficacious role of shared states in modeling systems of systems, in identification, and in the meta‐modeling of systems of systems; and (iii) the contributions of the above to strategic preparedness, response to, and recovery from catastrophic risk to such systems. Strategic preparedness connotes a decision‐making process and its associated actions. These must be: implemented in advance of a natural or human‐induced disaster, aimed at reducing consequences (e.g., recovery time, community suffering, and cost), and/or controlling their likelihood to a level considered acceptable (through the decisionmakers’ implicit and explicit acceptance of various risks and tradeoffs). The inoperability input‐output model (IIM), which is grounded on Leontief's input/output model, has enabled the modeling of interdependent subsystems. Two separate modeling structures are introduced. These are: phantom system models (PSM), where shared states constitute the essence of modeling coupled systems; and the IIM, where interdependencies among sectors of the economy are manifested by the Leontief matrix of technological coefficients. This article demonstrates the potential contributions of these two models to each other, and thus to more informative modeling of systems of systems schema. The contributions of shared states to this modeling and to systems identification are presented with case studies.  相似文献   
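The computational core of the IIM mentioned above is a linear system: sector inoperability q satisfies q = A*q + c*, so q = (I − A*)⁻¹ c*. A minimal numerical sketch with invented interdependency coefficients:

```python
import numpy as np

# Inoperability input-output model (IIM): q = A* q + c*, hence q = (I - A*)^-1 c*.
# A* is the interdependency matrix, c* the vector of directly induced
# inoperability. All numbers below are invented for illustration.
A_star = np.array([
    [0.00, 0.20, 0.10],   # power   <- (power, telecom, water)
    [0.30, 0.00, 0.05],   # telecom <- ...
    [0.25, 0.10, 0.00],   # water   <- ...
])
c_star = np.array([0.15, 0.00, 0.00])   # a disruption hits the power sector directly

q = np.linalg.solve(np.eye(3) - A_star, c_star)
for name, qi in zip(["power", "telecom", "water"], q):
    print(f"{name:8s} inoperability: {qi:.3f}")
```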

6.
Evaluations of Listeria monocytogenes dose‐response relationships are crucially important for risk assessment and risk management, but are complicated by considerable variability across population subgroups and L. monocytogenes strains. Despite difficulties associated with the collection of adequate data from outbreak investigations or sporadic cases, the limitations of currently available animal models, and the inability to conduct human volunteer studies, some of the available data now allow refinements of the well‐established exponential L. monocytogenes dose response to more adequately represent extremely susceptible population subgroups and highly virulent L. monocytogenes strains. Here, a model incorporating adjustments for variability in L. monocytogenes strain virulence and host susceptibility was derived for 11 population subgroups with similar underlying comorbidities using data from multiple sources, including human surveillance and food survey data. In light of the unique inherent properties of L. monocytogenes dose response, a lognormal‐Poisson dose‐response model was chosen, and proved able to reconcile dose‐response relationships developed based on surveillance data with outbreak data. This model was compared to a classical beta‐Poisson dose‐response model, which was insufficiently flexible for modeling the specific case of L. monocytogenes dose‐response relationships, especially in outbreak situations. Overall, the modeling results suggest that most listeriosis cases are linked to the ingestion of food contaminated with medium to high concentrations of L. monocytogenes. While additional data are needed to refine the derived model and to better characterize and quantify the variability in L. monocytogenes strain virulence and individual host susceptibility, the framework derived here represents a promising approach to more adequately characterize the risk of listeriosis in highly susceptible population subgroups.  相似文献   
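To see how variability in strain virulence and host susceptibility can be layered onto the classical exponential dose-response model, the Monte Carlo sketch below mixes the exponential model over a lognormally distributed single-hit parameter (the same idea underlying a lognormal-Poisson formulation); all parameter values are invented, not the published Listeria estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Exponential dose-response with a lognormally distributed single-hit
# probability r, a rough stand-in for variability in strain virulence and
# host susceptibility. Parameter values are illustrative only.
mu_log10_r, sd_log10_r = -12.0, 1.5        # r spans many orders of magnitude
doses = np.logspace(2, 8, 7)               # ingested CFU per serving

n = 200_000
r = 10 ** rng.normal(mu_log10_r, sd_log10_r, n)

for d in doses:
    p_ill = np.mean(1.0 - np.exp(-r * d))  # marginal risk, averaged over variability
    print(f"dose {d:9.0f} CFU -> P(illness) = {p_ill:.2e}")
```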

7.
Louis Anthony Cox, Jr. Risk Analysis, 2009, 29(8): 1062-1068
Risk analysts often analyze adversarial risks from terrorists or other intelligent attackers without mentioning game theory. Why? One reason is that many adversarial situations—those that can be represented as attacker‐defender games, in which the defender first chooses an allocation of defensive resources to protect potential targets, and the attacker, knowing what the defender has done, then decides which targets to attack—can be modeled and analyzed successfully without using most of the concepts and terminology of game theory. However, risk analysis and game theory are also deeply complementary. Game‐theoretic analyses of conflicts require modeling the probable consequences of each choice of strategies by the players and assessing the expected utilities of these probable consequences. Decision and risk analysis methods are well suited to accomplish these tasks. Conversely, game‐theoretic formulations of attack‐defense conflicts (and other adversarial risks) can greatly improve upon some current risk analyses that attempt to model attacker decisions as random variables or uncertain attributes of targets (“threats”) and that seek to elicit their values from the defender's own experts. Game theory models that clarify the nature of the interacting decisions made by attackers and defenders and that distinguish clearly between strategic choices (decision nodes in a game tree) and random variables (chance nodes, not controlled by either attacker or defender) can produce more sensible and effective risk management recommendations for allocating defensive resources than current risk scoring models. Thus, risk analysis and game theory are (or should be) mutually reinforcing.  相似文献   
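The attacker-defender structure described above (defender allocates first; attacker observes and then picks a target) can be solved by backward induction. A toy sketch with invented losses and success probabilities, not a calibrated risk model:

```python
from itertools import product

# Toy sequential attacker-defender game solved by backward induction
# (illustrative numbers only).
targets = ["port", "grid", "dam"]
base_loss = {"port": 100.0, "grid": 80.0, "dam": 60.0}   # defender loss if attack succeeds
budget = 2                                               # indivisible defense units

def success_prob(units):
    return 0.9 * (0.5 ** units)     # each unit of defense halves attack success

best = None
for alloc in product(range(budget + 1), repeat=len(targets)):
    if sum(alloc) != budget:
        continue
    # Attacker observes the allocation and attacks the target with the highest expected loss.
    exp_loss = {t: success_prob(a) * base_loss[t] for t, a in zip(targets, alloc)}
    attacked = max(exp_loss, key=exp_loss.get)
    if best is None or exp_loss[attacked] < best[1]:
        best = (dict(zip(targets, alloc)), exp_loss[attacked], attacked)

alloc, loss, attacked = best
print(f"defender allocation: {alloc}, attacker hits '{attacked}', expected loss {loss:.1f}")
```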

8.
Customers' willingness to pay (WTP) plays an anchoring role in pricing. This study proposes a new choice model based on WTP, incorporating sequential decision making, where the products with positive utility of purchase are considered in the order of customer preference. We compare the WTP‐choice model with the commonly used (multinomial) Logit model with respect to the underlying choice process, information requirements, and independence of irrelevant alternatives. Using the WTP‐choice model, we find and compare equilibrium and centrally optimal prices and profits without considering inventory availability. In addition, we compare equilibrium prices and profits in two contexts: without considering inventory availability and under lost sales. One of the interesting results with the WTP‐choice model is the “loose coupling” of retailers in competition; prices are not coupled but profits are. That is, each retailer should charge the monopoly price, as the collection of these prices constitutes an equilibrium, but each retailer's profit depends on the other retailers' prices. Loose coupling fails when WTPs are dependent or when preferences depend on prices. We also show that competition among retailers facing dependent WTPs can cause price cycles under some conditions. We consider real‐life data on sales of yogurt, ketchup, candy melt, and tuna, and check whether a version of the WTP‐choice model (with a uniform, triangle, or shifted exponential WTP distribution) or a standard or mixed Logit model fits and predicts the sales better. These empirical tests establish that the WTP‐choice model compares well and should be considered a legitimate alternative to Logit models for studying the pricing of products with low prices and high purchase frequency.  相似文献
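The core of a WTP-based choice rule is that a customer buys when the price does not exceed their willingness to pay. The sketch below uses a uniform WTP distribution, one of the variants mentioned above, to locate the single-product monopoly price; it is a toy illustration with invented numbers, not the paper's sequential multi-product model:

```python
import numpy as np

# WTP-based purchase rule with Uniform(wtp_lo, wtp_hi) willingness to pay
# (illustrative; triangle and shifted-exponential WTP and the sequential
# multi-product choice process are not modeled here).
wtp_lo, wtp_hi = 0.5, 3.0     # $ per unit (invented)
unit_cost = 0.4

def buy_prob(price):
    """P(WTP >= price) under the uniform WTP distribution."""
    return np.clip((wtp_hi - price) / (wtp_hi - wtp_lo), 0.0, 1.0)

prices = np.linspace(wtp_lo, wtp_hi, 501)
profit = (prices - unit_cost) * buy_prob(prices)
p_star = prices[np.argmax(profit)]

print(f"monopoly price: {p_star:.2f}, purchase probability: {buy_prob(p_star):.2f}")
```

Per the abstract's loose-coupling result, each competing retailer charging this monopoly price constitutes an equilibrium, while realized profits still depend on the rivals' prices.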

9.
Traditional discrete‐choice models assume buyers are aware of all products for sale. In markets where products change rapidly, the full information assumption is untenable. I present a discrete‐choice model of limited consumer information, where advertising influences the set of products from which consumers choose to purchase. I apply the model to the U.S. personal computer market where top firms spend over $2 billion annually on advertising. I find estimated markups of 19% over production costs, where top firms advertise more than average and earn higher than average markups. High markups are explained to a large extent by informational asymmetries across consumers, where full information models predict markups of one‐fourth the magnitude. I find that estimated product demand curves are biased toward being too elastic under traditional models. I show how to use data on media exposure to improve estimated price elasticities in the absence of micro ad data.  相似文献   

10.
The vulnerability of human beings exposed to a catastrophic disaster is affected by multiple factors that include hazard intensity, environment, and individual characteristics. The traditional approach to vulnerability assessment, based on the aggregate‐area method and unsupervised learning, cannot incorporate spatial information; thus, vulnerability can be only roughly assessed. In this article, we propose Bayesian network (BN) and spatial analysis techniques to mine spatial data sets to evaluate the vulnerability of human beings. In our approach, spatial analysis is leveraged to preprocess the data; for example, kernel density analysis (KDA) and accumulative road cost surface modeling (ARCSM) are employed to quantify the influence of geofeatures on vulnerability and relate such influence to spatial distance. The knowledge‐ and data‐based BN provides a consistent platform to integrate a variety of factors, including those extracted by KDA and ARCSM, to model vulnerability uncertainty. We also account for model uncertainty, using Bayesian model averaging and Occam's Window to average the multiple models obtained by our approach and thereby produce robust predictions of risk and vulnerability. We compare our approach with other probabilistic models in a case study of seismic risk and conclude that our approach is a good means of mining spatial data sets to evaluate vulnerability.  相似文献

11.
Researchers in judgment and decision making have long debunked the idea that we are economically rational optimizers. However, problematic assumptions of rationality remain common in studies of agricultural economics and climate change adaptation, especially those that involve quantitative models. Recent movement toward more complex agent‐based modeling provides an opportunity to reconsider the empirical basis for farmer decision making. Here, we reconceptualize farmer decision making from the ground up, using an in situ mental models approach to analyze weather and climate risk management. We assess how large‐scale commercial grain farmers in South Africa (n = 90) coordinate decisions about weather, climate variability, and climate change with those around other environmental, agronomic, economic, political, and personal risks that they manage every day. Contrary to common simplifying assumptions, we show that these farmers tend to satisfice rather than optimize as they face intractable and multifaceted uncertainty; they make imperfect use of limited information; they are differently averse to different risks; they make decisions on multiple time horizons; they are cautious in responding to changing conditions; and their diverse risk perceptions contribute to important differences in individual behaviors. We find that they use two important nonoptimizing strategies, which we call cognitive thresholds and hazy hedging, to make practical decisions under pervasive uncertainty. These strategies, evident in farmers' simultaneous use of conservation agriculture and livestock to manage weather risks, are the messy in situ performance of naturalistic decision‐making techniques. These results may inform continued research on such behavioral tendencies in narrower lab‐ and modeling‐based studies.  相似文献   

12.
In the quest to model various phenomena, the foundational importance of parameter identifiability to sound statistical modeling may be less well appreciated than goodness of fit. Identifiability concerns the quality of objective information in data to facilitate estimation of a parameter, while nonidentifiability means there are parameters in a model about which the data provide little or no information. In purely empirical models where parsimonious good fit is the chief concern, nonidentifiability (or parameter redundancy) implies overparameterization of the model. In contrast, nonidentifiability implies underinformativeness of available data in mechanistically derived models where parameters are interpreted as having strong practical meaning. This study explores illustrative examples of structural nonidentifiability and its implications using mechanistically derived models (for repeated presence/absence analyses and dose–response of Escherichia coli O157:H7 and norovirus) drawn from quantitative microbial risk assessment. Following algebraic proof of nonidentifiability in these examples, profile likelihood analysis and Bayesian Markov Chain Monte Carlo with uniform priors are illustrated as tools to help detect model parameters that are not strongly identifiable. It is shown that identifiability should be considered during experimental design and ethics approval to ensure generated data can yield strong objective information about all mechanistic parameters of interest. When Bayesian methods are applied to a nonidentifiable model, the subjective prior effectively fabricates information about any parameters about which the data carry no objective information. Finally, structural nonidentifiability can lead to spurious models that fit data well but can yield severely flawed inferences and predictions when they are interpreted or used inappropriately.  相似文献   
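A compact way to see structural nonidentifiability, and how profile likelihood exposes it, is a model in which only the product of two parameters enters the mean response; the toy example below (not one of the paper's QMRA models) shows the profiled negative log-likelihood staying essentially flat as one factor of the product is varied:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy structurally nonidentifiable model: the mean response depends only on
# the product a*b, so a and b are not separately identifiable from the data.
def mean_response(dose, a, b):
    return 1.0 - np.exp(-a * b * dose)

true_a, true_b, sigma = 2.0, 0.5, 0.02
dose = np.logspace(-1, 1, 20)
y = mean_response(dose, true_a, true_b) + rng.normal(0, sigma, dose.size)

def neg_log_lik(a, b):
    resid = y - mean_response(dose, a, b)
    return 0.5 * np.sum(resid**2) / sigma**2

# Profile likelihood for a: for each fixed a, optimize over b (grid search here).
b_grid = np.linspace(0.01, 10, 5000)
for a in [0.5, 1.0, 2.0, 4.0, 8.0]:
    prof = min(neg_log_lik(a, b) for b in b_grid)
    print(f"a = {a:4.1f}  profiled -log L = {prof:8.3f}")   # essentially flat in a
```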

13.
The techniques of financial modelling are becoming more popular and accepted as a useful information processing tool. However, what is ‘financial modelling’ and does the term adequately describe current applications? Why has financial modelling been such a growth area and what are the benefits? Given a desire to build models, where does the manager begin? What type of system and language should be used and by whom? What type of computing facility is most suitable? Which modelling system should be selected and what features are important? As well as increasing in number, financial modelling applications are now more complex. What guidelines can one use when designing large and complex models? This article seeks to answer these questions, concentrating on large and more complex models, particularly for long-term planning and budgeting applications. Finally, an example is given illustrating how a large modelling system can be constructed and maintained with little technical computer expertise.  相似文献

14.
Space weather phenomena have been studied in detail in the peer‐reviewed scientific literature. However, there has arguably been scant analysis of the potential socioeconomic impacts of space weather, despite a growing gray literature from different national studies, of varying degrees of methodological rigor. In this analysis, we therefore provide a general framework for assessing the potential socioeconomic impacts of critical infrastructure failure resulting from geomagnetic disturbances, applying it to the British high‐voltage electricity transmission network. Socioeconomic analysis of this threat has hitherto failed to address the general geophysical risk, asset vulnerability, and the network structure of critical infrastructure systems. We overcome this by using a three‐part method that includes (i) estimating the probability of intense magnetospheric substorms, (ii) exploring the vulnerability of electricity transmission assets to geomagnetically induced currents, and (iii) testing the socioeconomic impacts under different levels of space weather forecasting. This has required a multidisciplinary approach, providing a step toward the standardization of space weather risk assessment. We find that for a Carrington‐sized 1‐in‐100‐year event with no space weather forecasting capability, the gross domestic product loss to the United Kingdom could be as high as £15.9 billion, with this figure dropping to £2.9 billion based on current forecasting capability. However, with existing satellites nearing the end of their life, current forecasting capability will decrease in coming years. Therefore, if no further investment takes place, critical infrastructure will become more vulnerable to space weather. Additional investment could provide enhanced forecasting, reducing the economic loss for a Carrington‐sized 1‐in‐100‐year event to £0.9 billion.  相似文献   

15.
Damage models for natural hazards are used for decision making on reducing and transferring risk. The damage estimates from these models depend on many variables and their complex, sometimes nonlinear, relationships with the damage. In recent years, data‐driven modeling techniques have been used to capture those relationships. The available data to build such models are often limited, so in practice it is usually necessary to transfer models to a different context. In this article, we show that this implies the samples used to build the model are often not fully representative of the situation in which they need to be applied, which leads to a “sample selection bias.” We enhance data‐driven damage models by applying methods, not previously applied to damage modeling, to correct for this bias before the machine learning (ML) models are trained. We demonstrate this with case studies on flooding in Europe and typhoon wind damage in the Philippines. Two sample selection bias correction methods from the ML literature are applied, and one of these methods is also adjusted to our problem. These three methods are combined with stochastic generation of synthetic damage data. We demonstrate that for both case studies the sample selection bias correction techniques reduce model errors; for the mean bias error in particular, this reduction can be larger than 30%. The novel combination with stochastic data generation seems to enhance these techniques. This shows that sample selection bias correction methods are beneficial for damage model transfer.  相似文献
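One generic way to correct sample selection bias before training, density-ratio importance weighting, is sketched below with synthetic data; it is a stand-in for the idea rather than the specific correction methods evaluated in the paper, and the data-generating functions and parameters are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

# Synthetic illustration of sample selection bias correction by importance
# weighting. Source: damage data from one region; target: another region with
# a shifted hazard-intensity distribution (all functions/parameters invented).
def make_data(n, depth_mean):
    depth = rng.normal(depth_mean, 0.8, n).clip(0)            # flood depth (m)
    dmg = 1 - np.exp(-0.9 * depth) + rng.normal(0, 0.05, n)   # relative damage
    return depth.reshape(-1, 1), dmg.clip(0, 1)

X_src, y_src = make_data(2000, 1.0)    # biased training sample
X_tgt, _     = make_data(2000, 2.0)    # target domain (labels treated as unavailable)

# Classify source (0) vs target (1); weight w(x) is proportional to p(target|x)/p(source|x).
clf = LogisticRegression().fit(np.vstack([X_src, X_tgt]),
                               np.r_[np.zeros(len(X_src)), np.ones(len(X_tgt))])
p = clf.predict_proba(X_src)[:, 1]
weights = p / (1 - p)

plain    = RandomForestRegressor(random_state=0).fit(X_src, y_src)
weighted = RandomForestRegressor(random_state=0).fit(X_src, y_src, sample_weight=weights)

X_test, y_test = make_data(1000, 2.0)
for name, m in [("unweighted", plain), ("importance-weighted", weighted)]:
    print(name, "mean bias:", round(float(np.mean(m.predict(X_test) - y_test)), 4))
```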

16.
Dose‐response models are essential to quantitative microbial risk assessment (QMRA), providing a link between levels of human exposure to pathogens and the probability of negative health outcomes. In drinking water studies, the class of semi‐mechanistic models known as single‐hit models, such as the exponential and the exact beta‐Poisson, has seen widespread use. In this work, an attempt is made to carefully develop the general mathematical single‐hit framework while explicitly accounting for variation in (1) host susceptibility and (2) pathogen infectivity. This allows a precise interpretation of the so‐called single‐hit probability and precise identification of a set of statistical independence assumptions that are sufficient to arrive at single‐hit models. Further analysis of the model framework is facilitated by formulating the single‐hit models compactly using probability generating and moment generating functions. Among the more practically relevant conclusions drawn are: (1) for any dose distribution, variation in host susceptibility always reduces the single‐hit risk compared to a constant host susceptibility (assuming equal mean susceptibilities), (2) the model‐consistent representation of complete host immunity is formally demonstrated to be a simple scaling of the response, (3) the model‐consistent expression for the total risk from repeated exposures deviates (gives lower risk) from the conventional expression used in applications, and (4) a model‐consistent expression for the mean per‐exposure dose that produces the correct total risk from repeated exposures is developed.  相似文献   
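For reference, the two single-hit models named above have closed forms: exponential P(d) = 1 − exp(−r d) and exact beta-Poisson P(d) = 1 − 1F1(α, α + β, −d), where 1F1 is the Kummer confluent hypergeometric function. A short numerical sketch with arbitrary illustrative parameter values:

```python
import numpy as np
from scipy.special import hyp1f1

# Single-hit dose-response models (illustrative parameter values):
#   exponential:         P(d) = 1 - exp(-r * d)
#   exact beta-Poisson:  P(d) = 1 - 1F1(alpha, alpha + beta, -d)
r, alpha, beta = 5e-3, 0.25, 40.0

def p_exponential(dose):
    return 1.0 - np.exp(-r * dose)

def p_exact_beta_poisson(dose):
    return 1.0 - hyp1f1(alpha, alpha + beta, -dose)

for d in [1, 10, 100, 300]:
    print(f"dose {d:>5}: exponential {p_exponential(d):.4f}, "
          f"exact beta-Poisson {p_exact_beta_poisson(d):.4f}")
```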

17.
Critical infrastructures provide society with services essential to its functioning, and extensive disruptions give rise to large societal consequences. Risk and vulnerability analyses of critical infrastructures generally focus narrowly on the infrastructure of interest and describe the consequences as nonsupplied commodities or the cost of unsupplied commodities; they rarely holistically consider the larger impact with respect to higher‐order consequences for the society. From a societal perspective, this narrow focus may lead to severe underestimation of the negative effects of infrastructure disruptions. To explore this theory, an integrated modeling approach, combining models of critical infrastructures and economic input–output models, is proposed and applied in a case study. In the case study, a representative model of the Swedish power transmission system and a regionalized economic input–output model are utilized. This enables exploration of how a narrow infrastructure or a more holistic societal consequence perspective affects vulnerability‐related mitigation decisions regarding critical infrastructures. Two decision contexts related to prioritization of different vulnerability‐reducing measures are considered—identifying critical components and adding system components to increase robustness. It is concluded that higher‐order societal consequences due to power supply disruptions can be up to twice as large as first‐order consequences, which in turn has a significant effect on the identification of which critical components are to be protected or strengthened and a smaller effect on the ranking of improvement measures in terms of adding system components to increase system redundancy.  相似文献   

18.
Mergers and acquisitions are extremely sensitive, both within and outside the organizations involved. Confidentiality agreements are therefore essential for allowing teams the ‘space’ to develop potential scenarios for future integration. Despite the importance of confidentiality in practice, the subject has received little coverage in the management literature. By adopting a case‐study approach, this research explores aspects of confidentiality in a four‐year post‐acquisition integration programme in a FTSE100 pharmaceutical company. The paper identifies a range of personal impacts on the signatories, as well as various dimensions of information transfer despite the agreement being in place. Through the use of a metaphor, the research suggests that a confidentiality agreement has many similarities with the properties and characteristics of a bubble. This bubble trope is used to enhance conceptual understanding of confidentiality constraints in an organizational‐change context. The paper concludes by suggesting some ‘key learnings’ in relation to using confidentiality agreements in strategic change programmes such as a merger or acquisition.  相似文献   

19.
Floods are a natural hazard evolving in space and time according to meteorological and river basin dynamics, so that a single flood event can affect different regions over the event duration. This physical mechanism introduces spatio‐temporal relationships between flood records and losses at different locations over a given time window that should be taken into account for an effective assessment of the collective flood risk. However, since extreme floods are rare events, the limited number of historical records usually prevents a reliable frequency analysis. To overcome this limit, we move from the analysis of extreme events to the modeling of continuous stream flow records, preserving the spatio‐temporal correlation structure of the entire process and making more efficient use of the information provided by continuous flow records. The approach is based on the dynamic copula framework, which allows the modeling of spatio‐temporal properties to be split by coupling suitable time series models, which account for temporal dynamics, with multivariate distributions describing spatial dependence. The model is applied to 490 stream flow sequences recorded across 10 of the largest river basins in central and eastern Europe (Danube, Rhine, Elbe, Oder, Weser, Meuse, Rhone, Seine, Loire, and Garonne). Using available proxy data to quantify local flood exposure and vulnerability, we show that temporal dependence plays a key role in reproducing interannual persistence, and thus the magnitude and frequency of annual proxy flood losses aggregated at a basin‐wide scale, while copulas allow the spatial dependence of losses to be preserved at weekly and annual time scales.  相似文献
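The copula framework above separates marginal flow distributions from their dependence structure. A minimal static sketch (a Gaussian copula linking gamma-distributed flows at two gauges, with invented parameters; the paper's dynamic copula and time-series components are not reproduced):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Minimal Gaussian-copula sketch: couple gamma-distributed weekly flows at two
# gauges through a correlation parameter, then estimate the probability that
# both exceed their local 95th percentile. All parameters are invented.
rho = 0.7
marg1 = stats.gamma(a=2.0, scale=150.0)    # flow at gauge 1 (m^3/s)
marg2 = stats.gamma(a=3.0, scale=80.0)     # flow at gauge 2 (m^3/s)

n = 100_000
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)                       # uniform marginals with Gaussian dependence
q1, q2 = marg1.ppf(u[:, 0]), marg2.ppf(u[:, 1])

thr1, thr2 = marg1.ppf(0.95), marg2.ppf(0.95)
joint = np.mean((q1 > thr1) & (q2 > thr2))
print(f"P(both gauges exceed their 95th percentile) = {joint:.3f}  "
      f"(independence would give {0.05**2:.4f})")
```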

20.
Dose–response modeling of biological agents has traditionally focused on describing laboratory‐derived experimental data. Limited consideration has been given to understanding those factors that are controlled in a laboratory, but are likely to occur in real‐world scenarios. In this study, a probabilistic framework is developed that extends Brookmeyer's competing‐risks dose–response model to allow for variation in factors such as dose‐dispersion, dose‐deposition, and other within‐host parameters. With data sets drawn from dose–response experiments of inhalational anthrax, plague, and tularemia, we illustrate how for certain cases, there is the potential for overestimation of infection numbers arising from models that consider only the experimental data in isolation.  相似文献   
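Dose dispersion, one of the factors the probabilistic framework varies, can be illustrated by contrasting a Poisson-distributed dose with an overdispersed (negative binomial) dose of the same mean under a plain exponential single-hit model; the sketch below uses invented parameters and is not Brookmeyer's competing-risks model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Effect of dose dispersion on predicted infections under an exponential
# single-hit model (illustrative parameters). Same mean dose, different dispersion.
r = 5e-4                    # per-organism probability of initiating infection
mean_dose = 2000.0          # mean inhaled organisms
n = 200_000

poisson_dose = rng.poisson(mean_dose, n)
k = 0.5                     # negative binomial shape: small k means strong overdispersion
nb_dose = rng.negative_binomial(k, k / (k + mean_dose), n)

for name, dose in [("Poisson dose", poisson_dose), ("overdispersed dose", nb_dose)]:
    p_inf = np.mean(1.0 - np.exp(-r * dose))
    print(f"{name:18s}: mean dose {dose.mean():7.0f}, attack rate {p_inf:.3f}")
```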

