Similar Documents
20 similar documents found (search time: 46 ms)
1.
This paper considers the observational implications of social influences on adoption decisions in an environment of perfect-foresight adopters. We argue that social influences can produce two observable effects: (1) discontinuities in unconditional adoption curves and (2) pattern reversals in conditional adoption curves, in which earlier adoption is found for one group of actors versus another even though fundamentals suggest the reverse ordering should occur. The presence of either feature can, under weak assumptions, be interpreted as evidence of social influences; as such, these properties are robust implications of social effects. (JEL: C40, D01, O33)
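The first effect can be illustrated with a toy adoption model (a sketch, not the paper's perfect-foresight setup): agents with benefits uniform on [0, 1] adopt once benefit plus a social-influence term s times the adopting fraction exceeds a falling cost. With s = 0 the adoption curve is smooth; with s large enough it jumps. All numbers are illustrative.

```python
import numpy as np

def adoption_path(s, costs):
    """Equilibrium adoption fraction as the adoption cost falls over time.
    Benefits are uniform on [0, 1]; an agent adopts when
    benefit + s * (fraction adopted) >= cost.  Adoption is absorbing, so
    each period iterates upward from last period's level."""
    A, path = 0.0, []
    for c in costs:
        for _ in range(500):                        # fixed-point iteration
            A = min(1.0, max(A, 1.0 - c + s * A))   # max(A, .): no dis-adoption
        path.append(round(A, 2))
    return path

costs = np.linspace(1.6, 0.4, 13)                   # cost falls period by period
print("s=0.0:", adoption_path(0.0, costs))          # smooth adoption curve
print("s=1.5:", adoption_path(1.5, costs))          # discontinuous jump (effect 1)
```

With s = 1.5 the fixed-point map has slope above one, so once a small mass adopts the process tips to full adoption, producing the kind of discontinuity the paper links to social influence.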

2.

This paper concerns the staffing optimization problem in multi-skill call centers. The objective is to find a minimal-cost staffing solution that meets a target level for the quality of service (QoS) to customers. We consider a staffing problem in which joint chance constraints are imposed on the QoS of the day. Our joint chance-constrained formulation is more rational than the separate chance-constrained versions considered in previous studies because it captures the correlation between different call types. We show that, in general, the probability functions in the joint chance constraints display S-shaped curves, and that optimal solutions should belong to the concave regions of those curves. We therefore propose an approach combining a heuristic phase, which identifies solutions lying in the concave part, with a simulation-based cut-generation phase, which creates outer approximations of the probability functions. This allows us to find good staffing solutions satisfying the joint chance constraints by simulation and linear programming. We test our formulation and algorithm on call center examples with up to 65 call types and 89 agent groups, which demonstrate the benefits of the joint chance-constrained formulation and the advantage of our algorithm over standard ones.
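As a concrete illustration of checking a joint chance constraint by simulation, here is a minimal Monte Carlo sketch. The one-line "simulator" (capacity over realized demand) is a toy stand-in for a real multi-skill call-center simulation, and all parameters (calls_per_agent, targets, volumes) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def joint_qos_prob(staff, lam, calls_per_agent=50.0, target=0.95, n_days=10_000):
    """Monte Carlo estimate of P(QoS_k >= target for ALL call types k),
    i.e. the left-hand side of a joint chance constraint.  A day's QoS is
    approximated as served/offered calls -- a stand-in for a real
    call-center simulator."""
    arrivals = rng.poisson(lam, size=(n_days, len(lam)))    # random daily volumes
    capacity = np.asarray(staff) * calls_per_agent
    qos = np.minimum(1.0, capacity / np.maximum(arrivals, 1))
    return float(np.mean(np.all(qos >= target, axis=1)))    # joint, not per-type

lam = np.array([900.0, 400.0, 250.0])    # mean daily volume per call type
staff = np.array([18, 8, 5])             # agents per (single-skill) group
print(joint_qos_prob(staff, lam))        # feasible iff >= 1 - delta, e.g. 0.95
```

In a cut-generation scheme, each staffing vector found infeasible this way yields a linear cut added to an LP over the staffing variables.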


3.
This paper proposes new explicit formulas for computing the Tate pairing on Jacobi quartic elliptic curves. We give the first geometric interpretation of the group law on Jacobi quartic curves by presenting the functions that arise in addition and doubling. We draw together the best available optimizations for efficiently evaluating the Tate pairing on Jacobi quartic curves; the resulting formulas are competitive with all published formulas for Tate pairing computation on short Weierstrass or twisted Edwards curves. Finally, we present several examples of pairing-friendly Jacobi quartic elliptic curves that provide optimal Tate pairings.
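The pairing formulas themselves are too long for a sketch, but the group law the paper interprets geometrically is easy to exercise. Below is a minimal sketch assuming the Jacobi quartic in the form y^2 = x^4 + 2a·x^2 + 1 with the unified affine addition law listed in the Explicit-Formulas Database (Billet-Joye); the prime and curve constant are illustrative, and exceptional pairs with a vanishing denominator are ignored.

```python
# Jacobi quartic  y^2 = x^4 + 2*a*x^2 + 1  over F_p; identity (0, 1),
# negation -(x, y) = (-x, y).  Unified affine addition (EFD / Billet-Joye).
p, a = 1000003, 3                              # illustrative parameters (p % 4 == 3)

def add(P, Q):
    (x1, y1), (x2, y2) = P, Q
    t = x1 * x2 % p
    den = (1 - t * t) % p                      # may vanish on exceptional pairs
    x3 = (x1 * y2 + y1 * x2) * pow(den, -1, p) % p
    num = ((1 + t * t) * (y1 * y2 + 2 * a * t) + 2 * t * (x1 * x1 + x2 * x2)) % p
    return (x3, num * pow(den * den, -1, p) % p)

def point_with_x(x):                           # lift x to a curve point if possible
    rhs = (pow(x, 4, p) + 2 * a * x * x + 1) % p
    y = pow(rhs, (p + 1) // 4, p)              # square-root attempt, valid as p % 4 == 3
    return (x, y) if y * y % p == rhs else None

P = next(pt for pt in map(point_with_x, range(2, p)) if pt)
Q = add(P, P)
assert add(P, (0, 1)) == P                     # identity
assert add(P, ((-P[0]) % p, P[1])) == (0, 1)   # inverse
assert add(add(P, Q), Q) == add(P, add(Q, Q))  # associativity spot-check
print("group law checks pass for", P)
```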

4.
This paper studies a shape-invariant Engel curve system with endogenous total expenditure, in which the shape-invariant specification involves a common shift parameter for each demographic group in a pooled system of nonparametric Engel curves. We focus on the identification and estimation of both the nonparametric shapes of the Engel curves and the parametric demographic scaling parameters. The identification condition relates to bounded completeness, and the estimation procedure applies sieve minimum distance estimation of conditional moment restrictions, allowing for endogeneity. We establish a new root mean squared convergence rate for the nonparametric instrumental variable regression when the endogenous regressor may have unbounded support. Root-n asymptotic normality and semiparametric efficiency of the parametric components are also established under a set of "low-level" sufficient conditions. Our empirical application using the U.K. Family Expenditure Survey shows the importance of adjusting for endogeneity in both the nonparametric curvatures and the demographic parameters of Engel curve systems.
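A minimal sketch of the series/sieve flavor of such an estimator: approximate the Engel curve h by a polynomial sieve, project onto an instrument sieve, and minimize the projected distance (series two-stage least squares). The simulation is illustrative only and omits the paper's demographic shift parameters and data-driven sieve choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: x (log total expenditure) is endogenous, w is the instrument.
n = 2000
w = rng.uniform(-1, 1, n)
u = rng.normal(size=n)
x = 0.8 * w + 0.4 * u + 0.2 * rng.normal(size=n)  # endogenous via shared u
y = x + 0.5 * x**2 + u                            # true curve h0(x) = x + x^2/2

def poly(v, k):                                   # sieve basis [1, v, ..., v^k]
    return np.vander(v, k + 1, increasing=True)

Psi, Phi = poly(x, 4), poly(w, 6)                 # curve sieve, instrument sieve

# Sieve minimum distance: least squares after projecting onto the
# instrument space, P_w = Phi (Phi'Phi)^+ Phi'.
Pw = Phi @ np.linalg.pinv(Phi.T @ Phi) @ Phi.T
beta = np.linalg.lstsq(Pw @ Psi, Pw @ y, rcond=None)[0]

grid = np.linspace(-1, 1, 5)
print(np.c_[grid, poly(grid, 4) @ beta, grid + 0.5 * grid**2])  # est vs truth
```

Ordinary least squares of y on Psi would be biased here because x is correlated with u; the projection step is what exploits the conditional moment restriction E[y − h(x) | w] = 0.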

5.
The operations of the dispatch department of a manufacturing organization are analysed in terms of matching labour supply to labour demand (as determined by customer orders). The problem differed somewhat from those traditionally associated with dispatch departments, since set-up costs and company policy resulted in high stocks being carried. In this context, service level did not refer to the ability to supply a customer from stock but to the proportion of orders processed within a specified time interval. The operations of the department were simulated using FINSIM, a computer simulation package developed for the BBC microcomputer that displays the progress of the simulation graphically. The final outcome of the study was a decision-making aid consisting of a family of curves of service level against the maximum acceptable time allowed to process an order. This set of curves allowed prediction of the quantifiable consequences of any proposed changes in the department. Of particular significance was the use of the curves to demonstrate that the seemingly important delays to order documentation within the computer department were in fact inconsequential.
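The decision aid can be reproduced in miniature with any queueing simulation: simulate order flow times at a given staffing level, then read off the proportion processed within each candidate time limit. This is a generic sketch, not FINSIM; arrival and processing rates are assumptions.

```python
import random

random.seed(42)

def flow_times(n_orders, clerks, mean_work=2.0):
    """Toy dispatch model: orders arrive at ~1/hour, each goes to the
    first free clerk and queues FIFO; returns time in system per order."""
    finish = [0.0] * clerks
    times, t = [], 0.0
    for _ in range(n_orders):
        t += random.expovariate(1.0)                 # next arrival
        c = min(range(clerks), key=lambda j: finish[j])
        done = max(t, finish[c]) + random.expovariate(1.0 / mean_work)
        finish[c] = done
        times.append(done - t)
    return times

# Family of curves: service level vs maximum acceptable processing time,
# one curve per staffing level.
for clerks in (3, 4, 5):
    ft = flow_times(20_000, clerks)
    curve = {T: round(sum(f <= T for f in ft) / len(ft), 2) for T in (2, 4, 8, 16)}
    print(clerks, "clerks:", curve)
```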

6.
Recent studies indicate that neglecting the risk attitudes of decision-makers when establishing risk matrices has become a major limitation. To evaluate risk more comprehensively, an approach to establishing risk matrices that integrates risk attitudes based on utility theory is proposed. The approach has three main steps: (1) describing the risk attitudes of decision-makers by utility functions, (2) bridging the gap between utility functions and the risk matrix by utility indifference curves, and (3) discretizing the utility indifference curves. A complete risk matrix establishment process based on practical investigations is introduced; it uses decision-makers' answers to questionnaires to formulate the boundary values required for risk matrix establishment and the utility functions that quantify their respective risk attitudes.
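The three steps can be sketched directly. Here the elicited utility function is assumed to be exponential (a common risk-averse form), indifference curves are level sets of expected disutility in the likelihood-consequence plane, and discretizing them onto category midpoints yields the matrix. Category values and band boundaries are illustrative, not the paper's elicited ones.

```python
import math

rho = 50.0                                     # assumed risk tolerance
def disutility(c):                             # step 1: utility (larger = worse)
    return math.expm1(c / rho)

def cell_risk(p, c):                           # step 2: indifference curves are
    return p * disutility(c)                   # level sets p * disutility(c) = const

# Step 3: discretize onto a 5x5 matrix of category midpoints.
probs = [0.01, 0.05, 0.2, 0.5, 0.9]            # likelihood categories
conseqs = [1, 5, 20, 60, 100]                  # severity categories
bands = [0.01, 0.1, 1.0]                       # Low / Medium / High / Extreme cuts

for p in reversed(probs):
    cells = ["LMHE"[sum(cell_risk(p, c) > b for b in bands)] for c in conseqs]
    print(f"p={p:<5} " + " ".join(cells))
```

Because the assumed disutility is convex, the resulting matrix rates the high-consequence column more severely than a plain probability-times-consequence matrix would, which is exactly how a risk-averse attitude shows up.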

7.
Several experimental studies have provided evidence suggesting that indifference curves have a kink around the current endowment level. These results, which clearly contradict closely held economic doctrines, have led some influential commentators to call for an entirely new economic paradigm to displace conventional neoclassical theory, e.g., prospect theory, which invokes psychological effects. This paper pits neoclassical theory against prospect theory by investigating data drawn from more than 375 subjects actively participating in a well-functioning marketplace. The pattern of results suggests that prospect theory adequately organizes behavior among inexperienced consumers, whereas consumers with intense market experience behave largely in accordance with neoclassical predictions. Moreover, the data are consistent with the notion that consumers learn to overcome the endowment effect in situations beyond the specific problems they have previously encountered. This "transference of behavior" across domains has important implications in both a positive and a normative sense.

8.
David A. Huettner, Omega, 1973, 1(4), 421-450
Economists have traditionally employed one of two alternative methods when analyzing economies of scale: the long-run average cost (LRAC) curve and the production function. Only the production function concept, however, has been extended beyond a static framework for analysis of scale economies in a dynamic setting. This paper extends the traditional, static LRAC curve concept by developing an appropriate dynamic framework, which is then used to analyze the shifts of LRAC curves through time in three major American industries: steel making, cement manufacturing, and electric power generation. The empirical and theoretical topics explored in this study raise issues of both managerial and theoretical concern, including: the relationship between economic plant life and plant size; the existence of scale biases in previous studies of scale economies and in current depreciation practices; the accuracy and use of construction cost indexes; and the effects of technological change over extended periods of time. The dynamic framework developed here serves several useful purposes. It constitutes a first step toward theories that fill the void between the static theory of LRAC curves and the theories of increasing-, decreasing-, and constant-cost industries. Furthermore, many questions, such as optimal plant or firm size, should be answered in a dynamic framework if the appropriate managerial or antitrust issues are to be addressed. Finally, the framework shifts the emphasis of studies of scale economies back to costs, and its use should result in improved corporate planning and decision making.

9.
A "vertical" condensation scheme for discrete probability distribution (DPD) calculations is presented as an alternative to the earlier "horizontal" scheme, an example of which was presented recently by Kurth and Cox. When applied to DPDs over a space of curves, the vertical condensation results in a "regularization" of the "spaghetti" of curves that results from combination operations on such DPDs.  相似文献   

10.

This paper discusses the process of designing a tabu search-based heuristic for the two-stage flow shop problem with makespan minimization as the primary criterion and minimization of total flow time as the secondary criterion. A factorial experiment is designed to analyse thoroughly the effects of four factors, i.e. the initial solution, the type of move, the neighbourhood size, and the tabu list size, on the performance of the heuristic. Using evolution curves, response tables, and response graphs, coupled with the Taguchi method, the best combination of factors for the tabu search-based heuristic is identified, and the effectiveness of the heuristic in finding an optimal solution is evaluated by comparing its performance with the best known heuristic for this problem.
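A minimal sketch of the heuristic's skeleton, assuming pairwise-interchange moves, a fixed tabu tenure, and lexicographic (makespan, total flow time) comparison; the paper's factorial experiment is precisely about tuning such choices. Aspiration criteria and candidate-list strategies are omitted for brevity.

```python
import random

random.seed(7)

def objectives(seq, p1, p2):
    """(makespan, total flow time) of a job permutation on two machines."""
    t1 = t2 = flow = 0
    for j in seq:
        t1 += p1[j]                       # completion on machine 1
        t2 = max(t1, t2) + p2[j]          # completion on machine 2
        flow += t2
    return t2, flow

def tabu_search(p1, p2, iters=500, tenure=7):
    n = len(p1)
    cur = list(range(n)); random.shuffle(cur)
    best = (objectives(cur, p1, p2), cur[:])
    tabu = {}                             # swapped job pair -> iteration until tabu
    for it in range(iters):
        moves = []
        for i in range(n - 1):
            for j in range(i + 1, n):
                pair = (min(cur[i], cur[j]), max(cur[i], cur[j]))
                if tabu.get(pair, -1) >= it:
                    continue
                cand = cur[:]; cand[i], cand[j] = cand[j], cand[i]
                moves.append((objectives(cand, p1, p2), pair, cand))
        if moves:
            obj, pair, cur = min(moves)   # best admissible move, may be uphill
            tabu[pair] = it + tenure
            best = min(best, (obj, cur[:]))
    return best

p1 = [random.randint(1, 20) for _ in range(12)]
p2 = [random.randint(1, 20) for _ in range(12)]
print(tabu_search(p1, p2))
```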

11.
Class-based storage is widely studied in the literature and applied in practice. It divides all stored items into a number of classes according to their turnover, and a class of items with higher turnover is allocated to a region closer to the warehouse depot. The literature has shown that using more storage classes leads to shorter travel times for storing and retrieving items. A basic assumption in that literature is that the required storage space for all items equals their average inventory level, which is valid only if an infinite number of items can be stored in each storage region. This study revisits class-based storage by considering each storage space to contain only a finite number of items. We develop a travel time model and an algorithm for determining the optimal number and boundaries of storage classes in warehouses. In contrast to the conventional findings, our results illustrate that a small number of classes is commonly optimal. In addition, we find that travel time is fairly insensitive to the number of storage classes in a wide range around the optimum. This suggests that a manager can select a near-optimal number of storage classes in an easy way and need not worry about the impact of storage-class reconfigurations. We validate our findings for various cases, including different ABC demand curves, space-sharing factors, numbers of items, storage rack shapes, discrete storage locations, and stochastic item demand.
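A stylised version of the travel-time model and boundary search, assuming: items ranked by turnover, an ABC demand curve G(x) = x^s, travel time equal to slot position, and a class-level space term m + overhead·sqrt(m) standing in for the finite-capacity effect (the paper's actual space model is more detailed). Boundaries are found by brute force on a grid.

```python
from itertools import combinations
from math import sqrt

def avg_travel(bounds, s=0.2, overhead=0.1):
    """Expected one-way travel time for class boundaries `bounds` (item
    fractions).  Demand share of items in (lo, hi] is hi**s - lo**s; a
    class holding item fraction m occupies m + overhead*sqrt(m) slots."""
    fracs = [0.0, *bounds, 1.0]
    pos = travel = 0.0
    for lo, hi in zip(fracs, fracs[1:]):
        width = (hi - lo) + overhead * sqrt(hi - lo)
        travel += (hi ** s - lo ** s) * (pos + width / 2)
        pos += width
    return travel

grid = [i / 20 for i in range(1, 20)]
for k in (1, 2, 3, 4, 6, 9):                      # number of classes
    best = min(avg_travel(b) for b in combinations(grid, k - 1))
    print(f"{k} classes: best avg travel = {best:.4f}")
```

Even in this toy model the gains flatten quickly in the number of classes, echoing the finding that travel time is insensitive to the class count near the optimum.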

12.
This paper uses revealed preference inequalities to provide the tightest possible (best) nonparametric bounds on predicted consumer responses to price changes, using consumer-level data over a finite set of relative price changes. These responses are allowed to vary nonparametrically across the income distribution. This is achieved by combining the theory of revealed preference with semiparametric estimation of consumer expansion paths (Engel curves). We label these expansion-path-based bounds on demand responses E-bounds. Deviations from the revealed preference restrictions are measured by preference perturbations, which are shown to usefully characterize taste change and to provide a stochastic environment within which violations of the revealed preference inequalities can be assessed.
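The machinery underneath E-bounds starts from checking revealed preference inequalities on the data. A minimal sketch, assuming the standard GARP test (direct revealed preference matrix, Floyd-Warshall transitive closure, then a search for cycles with a strict step); the paper goes further by estimating expansion paths and perturbing preferences, which this sketch omits.

```python
import numpy as np

def garp_violations(P, Q):
    """P[t], Q[t]: price and quantity vectors at observation t.
    Returns (t, s) pairs where t is revealed preferred to s while s is
    strictly directly revealed preferred to t -- i.e. GARP violations."""
    E = P @ Q.T                                  # E[t, s] = cost of bundle s at prices t
    own = np.diag(E)
    R = own[:, None] >= E - 1e-9                 # direct revealed preference
    for k in range(len(R)):                      # transitive closure
        R = R | (R[:, k:k + 1] & R[k:k + 1, :])
    strict = own[:, None] > E + 1e-9             # strict direct preference
    return np.argwhere(R & strict.T)

rng = np.random.default_rng(3)
P = rng.uniform(1, 2, (8, 3))                    # 8 price regimes, 3 goods
Q = rng.uniform(1, 2, (8, 3))                    # observed demands
print(garp_violations(P, Q))                     # empty -> consistent with GARP
```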

13.
The paper presents a multi-phase approach for selecting a country in which to locate a global manufacturing facility. An influence diagram is used to frame the decision. A decision tree then analyzes uncertainties regarding cost and generates a risk profile. The risk profile becomes one of the measures in a multi-attribute utility theory (MAUT) model that incorporates a wide range of factors. This sequential approach of using the output of a decision tree as input to MAUT is demonstrated with an example involving an auto supplier locating a new plant in one of five countries. Three decision makers were interviewed to determine the weights and the shapes of the individual utility curves. The paper identifies, clearly defines, and incorporates a variety of measures for which national data are readily available; this list is broader and less subjective than other examples reported in the literature.
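A minimal additive-MAUT sketch of the final scoring phase; the weights, attribute scores, and exponential utility shape are illustrative stand-ins for the values elicited from the three decision makers, and the cost-risk score is where the decision tree's risk profile would enter.

```python
import math

weights = {"cost_risk": 0.35, "infrastructure": 0.25,
           "stability": 0.25, "labor_skills": 0.15}   # elicited in interviews

def u(x, rho=0.5):                    # concave single-attribute utility on [0, 1]
    return (1 - math.exp(-x / rho)) / (1 - math.exp(-1 / rho))

countries = {                          # attribute scores on [0, 1], illustrative
    "A": {"cost_risk": 0.8, "infrastructure": 0.6, "stability": 0.7, "labor_skills": 0.5},
    "B": {"cost_risk": 0.6, "infrastructure": 0.9, "stability": 0.8, "labor_skills": 0.7},
    "C": {"cost_risk": 0.9, "infrastructure": 0.5, "stability": 0.6, "labor_skills": 0.6},
}

def score(attrs):
    return sum(w * u(attrs[k]) for k, w in weights.items())

for name, attrs in sorted(countries.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(attrs):.3f}")
```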

14.
In the evaluation of chemical compounds for carcinogenic risk, regulatory agencies such as the U.S. Environmental Protection Agency and the National Toxicology Program (NTP) have traditionally fit a dose-response model to data from rodent bioassays and then used the fitted model to estimate a virtually safe dose, the dose corresponding to a very small increase (usually 10^-6) in risk over background. Much recent interest has been directed at incorporating additional scientific information about the specific chemical under investigation into the risk assessment process, including biological mechanisms of cancer induction, metabolic pathways, and chemical structure and activity. Although regulatory agencies are currently poised to allow nonlinear dose-response models based on the concept of an underlying threshold for nongenotoxic chemicals, there have been few attempts to investigate the overall relationship between the shape of dose-response curves and mutagenicity. Using data from an historical database of NTP cancer bioassays, the authors conducted a repeated-measures analysis of the shape estimates obtained by fitting extended Weibull dose-response curves. They conclude that genotoxic chemicals have dose-response curves closer to linear than those of nongenotoxic chemicals, though on average both types of compounds have convex dose-response curves, and the effect of genotoxicity is small.
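A sketch of fitting an extended Weibull model P(d) = g + (1−g)(1−exp(−b·d^k)) to one bioassay by maximum likelihood; the shape parameter k is the quantity compared across genotoxic and nongenotoxic chemicals (k near 1 is near-linear, k > 1 convex). The data below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

dose = np.array([0.0, 0.5, 1.0, 2.0])        # illustrative bioassay design
n = np.array([50, 50, 50, 50])               # animals per dose group
tumors = np.array([2, 5, 11, 26])

def nll(theta):
    """Binomial negative log-likelihood of the extended Weibull model."""
    g, b, k = theta
    p = g + (1 - g) * (1 - np.exp(-b * dose ** k))
    p = np.clip(p, 1e-9, 1 - 1e-9)           # guard against boundary values
    return -np.sum(tumors * np.log(p) + (n - tumors) * np.log(1 - p))

fit = minimize(nll, x0=[0.05, 0.2, 1.0], method="Nelder-Mead")
g, b, k = fit.x
print(f"background={g:.3f}  slope={b:.3f}  shape k={k:.2f}")
```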

15.
A quantitative risk analysis (QRA) of dangerous goods vehicles (DGVs) running through road tunnels was set up. Peak hourly traffic volumes (VHP), the percentage of heavy goods vehicles (HGVs), and failure of the emergency ventilation system were investigated to assess their impact on the risk level. The risk associated with an alternative route running entirely in the open air and passing through a highly populated urban area was also evaluated. The results in terms of social risk, expressed as F/N curves, show that the risk level increases with the VHP, the percentage of HGVs, and failure of the emergency ventilation system. The risk curves of the tunnel investigated were found to lie both above and below those of the open-air alternative route, depending on the type of dangerous goods transported. In particular, risk was greater in the tunnel for two fire scenarios (no explosion), whereas the risk to the exposed population was greater on the alternative route for three accident scenarios involving explosions and toxic releases. One should therefore be wary of assuming that dangerous products should take an itinerary running entirely in the open air when that itinerary passes through a populated area. The QRA may help decision-makers both to implement additional safety measures and to decide whether to allow, forbid, or limit the circulation of DGVs.
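Given per-scenario frequencies and fatality estimates (the output of the QRA's consequence models), an F/N curve is just the complementary cumulative frequency. A minimal sketch with invented numbers; comparing the tunnel and the open-air route amounts to comparing two such scenario lists.

```python
scenarios = [            # (frequency per year, expected fatalities) -- illustrative
    (2e-4, 3),           # small fire
    (5e-5, 12),          # large fire
    (8e-6, 60),          # explosion
    (3e-6, 150),         # toxic release in congested traffic
]

def fn_curve(scenarios, Ns=(1, 10, 100, 1000)):
    """F(N) = total frequency of scenarios with at least N fatalities."""
    return [(N, sum(f for f, n in scenarios if n >= N)) for N in Ns]

for N, F in fn_curve(scenarios):
    print(f"N >= {N:4d}: F = {F:.1e} per year")
```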

16.
17.
In this paper we review the use of tradeoff curves in the design of manufacturing systems that can be modeled as open queueing networks. We focus particularly on the tradeoff between expected work-in-process (or product leadtime) and capacity investment in job shops. We review the algorithms in the literature for deriving tradeoff curves and illustrate their application in evaluating the efficiency of the system, deciding how much capacity to install, allocating resources between the reduction of uncertainty and the introduction of new technologies, and assessing the impact of changes in product throughput and product mix. The methodology is illustrated with an example derived from an actual application in the semiconductor industry.
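The leadtime-capacity tradeoff is easiest to see at a single station modelled as an M/M/1 queue, where expected WIP is L = rho/(1−rho); sweeping the capacity mu traces the tradeoff curve. This is a sketch of the idea only: the job-shop algorithms reviewed in the paper work with network-wide approximations, not a single queue, and the numbers are illustrative.

```python
lam = 10.0                                   # arrival rate (jobs/hour)

print("capacity mu   utilisation   E[WIP]   E[leadtime] (h)")
for mu in (10.5, 11.0, 12.0, 14.0, 18.0, 26.0):
    rho = lam / mu
    L = rho / (1 - rho)                      # M/M/1 expected number in system
    W = L / lam                              # Little's law: leadtime
    print(f"{mu:11.1f} {rho:13.2f} {L:8.2f} {W:15.3f}")
```

The convexity of the curve is the managerial point: the first units of extra capacity buy large WIP reductions, later ones almost none.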

18.
The qualitative and quantitative evaluation of risk in developmental toxicology has been discussed in several recent publications.(1-3) A number of issues in this area remain to be resolved. The qualitative evaluation and interpretation of endpoints in developmental toxicology depends on an understanding of the biological events leading to the endpoints observed, the relationships among endpoints, and their relationship to dose and to maternal toxicity. The interpretation of these endpoints is also affected by the statistical power of the experiments used to detect them. Quantitative risk assessment attempts to estimate human risk for developmental toxicity as a function of dose. The current approach is to apply safety (uncertainty) factors to the no-observed-effect level (NOEL). An alternative presented and discussed here is to model the experimental data and apply a safety factor to an estimated risk level to achieve an "acceptable" level of risk. In cases where the dose-response curve bends upward, this approach provides a conservative estimate of risk. The procedure does not preclude the existence of a threshold dose. More research is needed to develop appropriate dose-response models that can provide better estimates for low-dose extrapolation of developmental effects.
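The alternative discussed here can be sketched in a few lines: fit a dose-response model, invert it at a small "acceptable" extra risk to get a benchmark-style dose, and apply the safety factor to that dose rather than to the NOEL. The fitted model and all numbers below are assumptions for illustration.

```python
import math

def extra_risk(d, b=0.05, k=1.6):            # assumed fitted dose-response model
    return 1.0 - math.exp(-((b * d) ** k))

def dose_at_risk(target, lo=0.0, hi=1e4):    # invert the monotone model by bisection
    for _ in range(200):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if extra_risk(mid) < target else (lo, mid)
    return lo

bmd = dose_at_risk(0.01)                     # dose at 1% extra risk
print(f"dose at 1% extra risk: {bmd:.2f};  with 100x safety factor: {bmd / 100:.4f}")
```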

19.
20.
The devastating impact of Hurricane Sandy (2012) showed again that New York City (NYC) is one of the most vulnerable cities in the world to coastal flooding. The low-lying areas of NYC can be flooded by nor'easters and North Atlantic hurricanes. The few studies that have estimated potential flood damage for NYC base their estimates on only a single, or a few, possible flood events. The objective of this study is to assess the full distribution of hurricane flood risk in NYC. This is done by calculating potential flood damage with a flood damage model that takes many possible storms and surge heights as input; these storms are representative of the low-probability/high-impact flood hazard faced by the city. Exceedance probability-loss curves are constructed under different assumptions about the severity of flood damage. The estimated flood damage to buildings in NYC is between US$59 million and US$129 million per year. The damage caused by a 1/100-year storm surge is in the range of US$2 bn to 5 bn, and between US$5 bn and 11 bn for a 1/500-year storm surge. An analysis of flood risk in each of the five boroughs finds that Brooklyn and Queens are the most vulnerable to flooding. The study also examines several uncertainties in the various steps of the risk analysis that lead to variation in the flood damage estimates, including the interpolation of flood depths, the use of different flood damage curves, and the influence of the spectrum of characteristics of the simulated hurricanes.
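The headline risk number can be reproduced from the exceedance probability-loss curve: expected annual damage is the area under that curve. A minimal sketch with illustrative losses chosen to fall in the ranges reported above (these are not the study's data points), ignoring the tail beyond the largest return period.

```python
import numpy as np

return_periods = np.array([10, 50, 100, 500, 1000])   # years
losses = np.array([0.1, 1.0, 3.5, 8.0, 12.0])         # US$ bn per event (illustrative)

p = 1.0 / return_periods                              # annual exceedance probability
idx = np.argsort(p)                                   # ascending probabilities
pe, le = p[idx], losses[idx]
ead = float(np.sum(np.diff(pe) * (le[1:] + le[:-1]) / 2))  # trapezoid: integral of L dp
print(f"expected annual damage ~ US${ead * 1000:.0f} million/year")
```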
