Similar Documents
20 similar documents found (search time: 15 ms)
1.
The bullwhip effect describes the tendency for the variance of orders in supply chains to increase as one moves upstream from consumer demand. We report on a set of laboratory experiments with a serial supply chain that tests behavioral causes of this phenomenon, in particular the possible influence of coordination risk. Coordination risk exists when individuals' decisions contribute to a collective outcome and the decision rules followed by each individual are not known with certainty, for example, where managers cannot be sure how their supply chain partners will behave. We conjecture that the existence of coordination risk may contribute to bullwhip behavior. We test this conjecture by controlling for environmental factors that lead to coordination risk and find these controls lead to a significant reduction in order oscillations and amplification. Next, we investigate a managerial intervention to reduce the bullwhip effect, inspired by our conjecture that coordination risk contributes to bullwhip behavior. Although the intervention, holding additional on-hand inventory, does not change the existence of coordination risk, it reduces order oscillation and amplification by providing a buffer against the endogenous risk of coordination failure. We conclude that the magnitude of the bullwhip can be mitigated, but that its behavioral causes appear robust.
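
A minimal simulation makes the amplification concrete. The sketch below is illustrative only, not the authors' experimental protocol: a four-stage serial chain in which each stage follows an order-up-to policy driven by an exponentially smoothed forecast, with all parameter values (lead time L, smoothing constant alpha, safety factor z) hypothetical.

```python
import numpy as np

# Bullwhip sketch: orders placed by one stage become demand for the next.
rng = np.random.default_rng(0)
T, L, alpha, z = 500, 2, 0.25, 1.65   # periods, lead time, smoothing, safety factor

def orders_from_demand(demand):
    forecast, sigma, prev_base, out = demand[0], 1.0, None, []
    for d in demand:
        err = d - forecast
        forecast += alpha * err
        sigma += alpha * (abs(err) - sigma)   # smoothed absolute forecast error
        base = L * forecast + z * sigma       # order-up-to level
        q = d if prev_base is None else max(0.0, d + base - prev_base)
        prev_base = base
        out.append(q)
    return np.array(out)

signal = np.maximum(0, rng.normal(10, 2, T))  # consumer demand
for stage in range(4):
    print(f"stage {stage}: order variance = {signal.var():.2f}")
    signal = orders_from_demand(signal)
```

The printed order variances grow from stage to stage, which is the bullwhip signature the experiments probe.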

2.
Joint procurement often multiplies order batch sizes, making it easier to qualify for the price discounts offered by suppliers, so joint procurement is favored by retailers. We consider a two-echelon supply chain for ameliorating items consisting of a single supplier and multiple retailers, in which the supplier replenishes the retailers non-instantaneously. We formulate the cost-per-unit-time functions for retailers purchasing independently and purchasing jointly, derive the minimum cost per unit time under each procurement mode, and compare the two to obtain a necessary condition for joint procurement to outperform independent procurement. Taking the coalition cost of joint procurement as the quantity to be allocated, we apply cooperative game theory to cast the cost-allocation problem as a multi-player cooperative game and present a least-core approach to allocating the cost. A numerical example illustrates the allocation process, a sensitivity analysis of the ordering and cost parameters with respect to the net amelioration rate is given, and the allocations produced by four cost-allocation algorithms are compared.
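
As a rough illustration of the least-core allocation step, the sketch below solves the least-core linear program for a three-retailer game. The coalition costs cost(S) are hypothetical; in the paper they come from the unit-time cost functions.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

players = [0, 1, 2]
cost = {(0,): 100, (1,): 120, (2,): 140,
        (0, 1): 190, (0, 2): 210, (1, 2): 230, (0, 1, 2): 300}

# Variables: x0, x1, x2 (allocations) and eps. Minimize eps subject to
#   sum_{i in S} x_i - eps <= cost(S) for every proper coalition S,
#   x0 + x1 + x2 = cost(N)  (efficiency).
A_ub, b_ub = [], []
for r in (1, 2):
    for S in itertools.combinations(players, r):
        A_ub.append([1.0 if i in S else 0.0 for i in players] + [-1.0])
        b_ub.append(cost[S])
res = linprog(c=[0, 0, 0, 1], A_ub=A_ub, b_ub=b_ub,
              A_eq=[[1, 1, 1, 0]], b_eq=[cost[(0, 1, 2)]],
              bounds=[(None, None)] * 4)
# eps < 0 means every coalition has slack, i.e., the core is non-empty.
print("allocation:", np.round(res.x[:3], 2), "epsilon:", round(res.x[3], 2))
```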

3.
In this paper I demonstrate how one can generalize finitely many examples to statements about (infinite) classes of economic models. If there exist upper bounds on the number of connected components of one-dimensional linear subsets of the set of parameters for which a conjecture is true, one can conclude that it is correct for all parameter values in the class considered, except for a small residual set, once one has verified the conjecture for a predetermined finite set of points. I show how to apply this insight to computational experiments and spell out assumptions on the economic fundamentals that ensure that the necessary bounds on the number of connected components exist. I argue that these methods can be fruitfully utilized in applied general equilibrium analysis. I provide general assumptions on preferences and production sets that ensure that economic conjectures define sets with a bounded number of connected components. Using the theoretical results, I give an example of how one can explore qualitative and quantitative implications of general equilibrium models using computational experiments. Finally, I show how random algorithms can be used for generalizing examples in high-dimensional problems.

4.
Properties of instrumental variable estimators are sensitive to the choice of valid instruments, even in large cross-section applications. In this paper we address this problem by deriving simple mean-square error criteria that can be minimized to choose the instrument set. We develop these criteria for two-stage least squares (2SLS), limited information maximum likelihood (LIML), and a bias-adjusted version of 2SLS (B2SLS). We give a theoretical derivation of the mean-square error and show optimality. In Monte Carlo experiments we find that the instrument choice generally yields an improvement in performance. Also, in the Angrist and Krueger (1991) returns-to-education application, when the instrument set is chosen in the way we consider, it turns out that both 2SLS and LIML give similar (large) returns to education.
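
For readers unfamiliar with the estimator these criteria are built around, the sketch below computes plain 2SLS on simulated data with a deliberately padded instrument set. It is not the paper's mean-square error criterion, and all simulated quantities are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 200, 30                                 # sample size, number of instruments
Z = rng.normal(size=(n, K))                    # instruments
pi = np.r_[0.5, 0.2, np.zeros(K - 2)]          # most instruments are irrelevant
u = rng.normal(size=n)
x = Z @ pi + 0.8 * u + rng.normal(size=n)      # endogenous regressor
y = 1.0 * x + u                                # true coefficient = 1

def tsls(y, x, Z):
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # first stage
    return (x_hat @ y) / (x_hat @ x)                   # second stage

print("all 30 instruments  :", round(tsls(y, x, Z), 3))
print("2 strong instruments:", round(tsls(y, x, Z[:, :2]), 3))
```

With many weak instruments the 2SLS estimate drifts toward the OLS bias; trading off that bias against variance is exactly what the proposed criteria formalize.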

5.
In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls, to supply chain risks with inventory controls, and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
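
The effect is easy to reproduce numerically. The Monte Carlo sketch below is a simplified illustration with a hypothetical sample size and nominal probability: it fits a log-normal model to n observations, sets the threshold at the fitted (1 - p) quantile, and averages the true exceedance probability.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
p_nominal, n, reps = 0.01, 30, 20000
mu, sigma = 0.0, 1.0                     # true parameters (unknown in practice)

exceed = []
for _ in range(reps):
    logs = rng.normal(mu, sigma, n)      # observed log-losses
    m, s = logs.mean(), logs.std(ddof=1)
    thr = m + s * stats.norm.ppf(1 - p_nominal)            # plug-in quantile
    exceed.append(1 - stats.norm.cdf((thr - mu) / sigma))  # true exceedance prob

print(f"nominal: {p_nominal:.4f}, actual mean failure prob: {np.mean(exceed):.4f}")
# Because the log-normal is a transformed location-scale family, this actual
# probability does not depend on the true mu and sigma, so it can be computed
# exactly and corrected for (approach 1 in the abstract).
```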

6.
We present a new biclustering algorithm to simultaneously discover tissue classes and identify a set of genes that well-characterize these classes from DNA microarray data sets. We employ a combinatorial optimization approach where the objective is to simultaneously identify an interesting set of genes and a partition of the array samples that optimizes a certain score based on a novel color island statistic. While this optimization problem is NP-complete in general, we are effectively able to solve problems of interest to optimality using a branch-and-bound algorithm. We have tested the algorithm on a 30-sample Cutaneous T-cell Lymphoma data set; it was able to almost perfectly discriminate short-term survivors from long-term survivors and normal controls. Another useful feature of our method is that it can easily handle missing expression data.

7.
Solving multicriteria decision making problems often requires the assessment of certain preferential information. On some occasions, this information must be given by several individuals or social groups, and these individual assessments need to be aggregated into single global preferences. This cardinal preferences aggregation problem has been tackled using different techniques, including multicriteria decision making ones. In this paper, a Meta-Goal Programming approach is proposed, where different target values can be set on several achievement functions that measure the goodness of the global assessments. This methodology presents strong advantages due to its modeling flexibility and its ability to find balanced solutions. The proposed approach is demonstrated with an illustrative example and a series of computational experiments, and it is shown that the Meta-Goal Programming method produces results with better values of the achievement functions than other classical and multicriteria approaches.
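
The sketch below illustrates the flavor of such goal programs on a toy aggregation instance: a MinMax goal program that finds consensus scores minimizing the largest deviation from any individual assessment. It is only a simplified stand-in for the paper's meta-goal programming model, and all scores are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

A = np.array([[0.7, 0.2, 0.1],     # individual 1's scores for 3 alternatives
              [0.5, 0.3, 0.2],     # individual 2
              [0.6, 0.1, 0.3]])    # individual 3
m, k = A.shape

# Variables: x_1..x_k (consensus scores) and D (max deviation).
# Minimize D subject to  x_j - D <= a_ij  and  -x_j - D <= -a_ij.
obj = np.r_[np.zeros(k), 1.0]
A_ub, b_ub = [], []
for i in range(m):
    for j in range(k):
        e = np.zeros(k + 1); e[j], e[k] = 1.0, -1.0
        A_ub.append(e.copy()); b_ub.append(A[i, j])
        e[j] = -1.0
        A_ub.append(e.copy()); b_ub.append(-A[i, j])
res = linprog(obj, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, 1)] * k + [(0, None)])
print("consensus scores:", np.round(res.x[:k], 3),
      "max deviation:", round(res.x[k], 3))
```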

8.
An Empirical Study of the Strategic Value of Information Technology and Its Realization Mechanism
To study the strategic value of information technology and its realization mechanism, we draw on the resource-based view of the firm, competitive strategy theory, and core competence theory to build a research model relating IT resources, IT application capability, strategic-level information systems capability, environmental dynamism, and firm performance. Structural equation modeling is applied to survey data from 233 Chinese firms for data analysis and model fitting. The results show that neither IT resources nor IT application capability directly affects firm performance; ...

9.
Adrian Kent, Risk Analysis, 2004, 24(1): 157-168
Recent articles by Busza et al. (BJSW) and Dar et al. (DDH) argue that astrophysical data can be used to establish small bounds on the risk of a "killer strangelet" catastrophe scenario in the RHIC and ALICE collider experiments. The case for the safety of the experiments set out by BJSW does not rely solely on these bounds, but on theoretical arguments, which BJSW find sufficiently compelling to firmly exclude any possibility of catastrophe. Nonetheless, DDH and other commentators (initially including BJSW) suggested that these empirical bounds alone do give sufficient reassurance. This seems unsupportable when the bounds are expressed in terms of expectation value, a good measure according to standard risk analysis arguments. For example, DDH's main bound, p(catastrophe) < 2 × 10^-8, implies only that the expectation value of the number of deaths is bounded by 120; BJSW's most conservative bound implies the expectation value of the number of deaths is bounded by 60,000. This article reappraises the DDH and BJSW risk bounds by comparison with risk policy in other areas. For example, it is noted that, even if highly risk-tolerant assumptions are made and no value is placed on the lives of future generations, a catastrophe risk no higher than approximately 10^-15 per year would be required for consistency with established policy for radiation hazard risk minimization. Allowing for risk aversion and for future lives, a respectable case can be made for requiring a bound many orders of magnitude smaller. In summary, the costs of small risks of catastrophe have been significantly underestimated by BJSW (initially), by DDH, and by other commentators. Future policy on catastrophe risks would be more rational, and more deserving of public trust, if acceptable risk bounds were generally agreed upon ahead of time and if serious research on whether those bounds could indeed be guaranteed was carried out well in advance of any hypothetically risky experiment, with the relevant debates involving experts with no stake in the experiments under consideration.
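
The expectation figures follow from one-line arithmetic, reproduced below with the world population taken as roughly 6 × 10^9 (its approximate value at the time of writing).

```python
# Back-of-envelope arithmetic behind the abstract's expectation figures.
population = 6e9
print(2e-8 * population)       # DDH's bound p < 2e-8  ->  120 expected deaths
print(60_000 / population)     # BJSW's 60,000 figure  ->  implied p = 1e-05
```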

10.
Despite being theoretically suboptimal, simpler contracts (such as price-only contracts and quantity discount contracts with a limited number of price blocks) are commonly preferred in practice. Thus, exploring the tension between theory and practice regarding complexity and performance in contract design is especially relevant. Using human subject experiments, Kalkancı et al. (2011) showed that such simpler contracts perform effectively for a supplier interacting with a computerized buyer under asymmetric demand information. We use a similar set of experiments with the modification that a human supplier interacts with a human buyer. We show that human interactions strengthen the supplier's preference for simpler contracts. We find that suppliers have fairness concerns even when they interact with computerized buyers. These fairness concerns tend to be even stronger when suppliers interact with human buyers, particularly when the complexity of the contract is low. We also find that suppliers are more prone to random decision errors (i.e., bounded rationality) when interacting with human buyers. In the absence of social preferences, Kalkancı et al. identified reinforcement and bounded rationality as key biases that impact suppliers' decisions. In human-to-human experiments, we find evidence for social preference effects. However, these effects may be secondary to bounded rationality.

11.
Internal Organizational Factors and the Acquisition of Competitive Advantage
Building on prior research on how competitive advantage is acquired, this paper examines the relationships between three internal organizational factors (key resources, core competence, and organizational learning) and competitive advantage, and explains the different levels and mechanisms through which these three factors operate in establishing competitive advantage. A simple mathematical model is developed to explore the relative efficiency of the three factors in establishing competitive advantage and the comparative advantage that can be gained by building core competence and engaging in organizational learning.

12.
Flexible work arrangements present managers with challenges regarding how to manage employees using these arrangements. To date, little research has investigated how managers address these challenges. We investigate the relationship between the use of a specific implementation of flexible work (teleworking) and control system design, specifically the emphasis on output controls. Teleworking reduces the feasibility of monitoring employee behaviour as a control mechanism. Control theory suggests that this might be compensated for by placing more emphasis on output controls. We conduct a survey (N = 897) among employees of a financial services institution, of whom 69% are allowed to telework. We find that among teleworking employees, the share of teleworking hours is positively related to the emphasis on output controls. However, employees who are allowed to telework report less emphasis on output controls by their manager relative to those not allowed to telework. We suggest various directions for future research, which may help in explaining these findings.

13.
We consider a real world generalization of the 2-Dimensional Guillotine Cutting Stock Problem arising in the wooden board cutting industry. A set of rectangular items has to be cut from rectangular stock boards, available in multiple formats. In addition to the classical objective of trim loss minimization, the problem also asks for the maximization of the cutting equipment productivity, which can be obtained by cutting identical boards in parallel. We present several heuristic algorithms for the problem, explicitly considering the optimization of both objectives. The proposed methods, including fast heuristic algorithms, Integer Linear Programming models and a truncated Branch and Price algorithm, have increasing complexity and require increasing computational effort. Extensive computational experiments on a set of realistic instances from the industry show that high productivity of the cutting equipment can be obtained with a minimal increase in the total area of used boards. The experiments also show that the proposed algorithms perform extremely well when compared with four commercial software tools available for the solution of the problem.
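
For orientation, the toy level (shelf) heuristic below packs items onto a single board; every level layout is guillotine-cuttable (one horizontal cut per shelf, then vertical cuts per item). It is a simple baseline, far simpler than the paper's algorithms, and the item and board dimensions are hypothetical.

```python
def shelf_pack(items, board_w, board_h):
    """items: list of (w, h); returns placements [(x, y, w, h)] or None."""
    placements, y, x, shelf_h = [], 0, 0, 0
    for w, h in sorted(items, key=lambda it: -it[1]):   # decreasing height
        if x + w > board_w:                  # current shelf full: start a new one
            y, x, shelf_h = y + shelf_h, 0, 0
        if y + (h if shelf_h == 0 else shelf_h) > board_h or w > board_w:
            return None                      # item does not fit on this board
        shelf_h = max(shelf_h, h)
        placements.append((x, y, w, h))
        x += w
    return placements

items = [(600, 400), (600, 400), (900, 300), (300, 300)]  # widths x heights, mm
for p in shelf_pack(items, board_w=1200, board_h=800):
    print(p)
```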

14.
We consider the following optimization problem. There is a set of n dedicated jobs that are to be processed on m parallel machines. The job set is partitioned into subsets, and the jobs of each subset share a common due date. Processing times of jobs are interconnected and are themselves subject to the decision making. The goal is to choose a processing time for each job in a feasible way and to construct a schedule that minimizes the maximum lateness. We show that the problem is NP-hard even if m = 1 and that it is NP-hard in the strong sense if m is a variable. We prove that there is no polynomial approximation algorithm with a guaranteed approximation ratio of less than 2. We propose an integer linear formulation for the problem and perform experiments. The experiments show that the solutions obtained with CPLEX within a limit of 5 minutes are on average about 5% from the optimum value for instances with up to 150 jobs, 16 subsets, and 11 machines. Most instances were solved to optimality, and the average CPLEX running time was 32 s for these instances.
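
As a point of reference for the objective, the sketch below computes the maximum lateness achieved by earliest-due-date (EDD) list scheduling on m parallel machines with fixed processing times. The paper's harder problem additionally chooses the interconnected processing times (its exact approach is the integer linear program), and the job data here are hypothetical.

```python
import heapq

def edd_max_lateness(jobs, m):
    """jobs: list of (processing_time, due_date); returns L_max."""
    machines = [0.0] * m                     # next free time per machine
    heapq.heapify(machines)
    l_max = float("-inf")
    for p, d in sorted(jobs, key=lambda j: j[1]):   # EDD order
        start = heapq.heappop(machines)      # earliest-available machine
        completion = start + p
        heapq.heappush(machines, completion)
        l_max = max(l_max, completion - d)
    return l_max

jobs = [(4, 6), (3, 5), (7, 12), (2, 4), (5, 11)]
print("L_max =", edd_max_lateness(jobs, m=2))
```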

15.
Maurice H. Peston, Omega, 1978, 6(2): 117-121
Control theory has important contributions to make to the theory of macro-economic policy. These include a unified view of estimation, forecasting and control and a consequently clearer focus in identifying performance criteria. Further, the methods of control theory imply objectivity and a need for assumptions to be made explicit. By contrast, the UK lacks a research body for forecasting and policy evaluation detached from doctrinal or methodological commitment. Engineering control methods, however, diverge from those wholly acceptable to economic systems in that, in the latter, the controller is himself part of the system he controls and some of his controls operate only indirectly. A further difference is that economics cannot as yet furnish the controller with reliable models of system behaviour linking policy instruments to set targets.

16.
It has been suggested that the motivation to spend effort is decreased in burnout patients, resulting in reduced cognitive performance. A question that remains is whether this decreased motivation can be reversed by motivational interventions. We investigated this by examining the effect of a motivational intervention on cognitive performance. We presented 40 burnout patients in the Netherlands and 40 matched healthy controls with a complex attention task. As expected, in a first block of trials the performance of the burnout patients was poorer than that of the healthy controls. Subsequently, we provided the participants with fake positive feedback about their performance and announced that we would financially reward those who performed best in a subsequent block of trials. Contrary to the healthy controls, the burnout patients did not improve their performance and experienced more aversion to spending effort. The study demonstrated that impaired cognitive performance in burnout patients could not be reversed by motivational interventions, which is in line with contemporary theories on burnout stating that physiological changes in burnout may underlie a relatively long-term decrease in motivation. The implication of these results is that, in practice, employers and therapists might need to accept that there could be a reduction in cognitive performance in employees with burnout.

17.
We consider assortment problems under a mixture of multinomial logit models. There is a fixed revenue associated with each product. There are multiple customer types. Customers of different types choose according to different multinomial logit models whose parameters depend on the type of the customer. The goal is to find a set of products to offer so as to maximize the expected revenue obtained over all customer types. This assortment problem under the multinomial logit model with multiple customer types is NP-complete. Although there are heuristics to find good assortments, it is difficult to verify the optimality gap of the heuristics. In this study, motivated by the difficulty of finding optimal solutions and verifying the optimality gap of heuristics, we develop an approach to construct an upper bound on the optimal expected revenue. Our approach can quickly provide upper bounds and these upper bounds can be quite tight. In our computational experiments, over a large set of randomly generated problem instances, the upper bounds provided by our approach deviate from the optimal expected revenues by 0.15% on average and by less than one percent in the worst case. By using our upper bounds, we are able to verify the optimality gaps of a greedy heuristic accurately, even when optimal solutions are not available.
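
The objective being bounded is easy to state in code. The sketch below evaluates expected revenue under a hypothetical two-type mixture of MNL models and finds the optimal assortment by brute force, which is exactly what becomes impractical at realistic sizes and motivates the proposed upper bounds.

```python
from itertools import combinations

r = [10.0, 8.0, 6.0, 4.0]                      # product revenues (hypothetical)
v = [[0.8, 0.7, 0.1, 0.1],                     # type-1 preference weights
     [0.1, 0.2, 0.9, 0.8]]                     # type-2 preference weights
theta = [0.5, 0.5]                             # type probabilities

def revenue(S):
    total = 0.0
    for w, vt in zip(theta, v):
        denom = 1.0 + sum(vt[i] for i in S)    # 1 is the no-purchase weight
        total += w * sum(r[i] * vt[i] for i in S) / denom
    return total

best = max((S for k in range(1, len(r) + 1)
            for S in combinations(range(len(r)), k)), key=revenue)
print("optimal assortment:", best, "revenue:", round(revenue(best), 3))
```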

18.
Criteria to protect aquatic life are intended to protect diverse ecosystems, but in practice are usually developed from compilations of single-species toxicity tests using standard test organisms that were tested in laboratory environments. Species sensitivity distributions (SSDs) developed from these compilations are extrapolated to set aquatic ecosystem criteria. The protectiveness of the approach was critically reviewed with a chronic SSD for cadmium comprising 27 species within 21 genera. Within the data set, one genus had lower cadmium effects concentrations than the SSD fifth-percentile-based criterion, so in theory this genus, the amphipod Hyalella, could be lost or at least allowed some level of harm by this criteria approach. However, population matrix modeling projected only slightly increased extinction risks for a temperate Hyalella population under scenarios similar to the SSD fifth percentile criterion. The criterion value was further compared to cadmium effects concentrations in ecosystem experiments and field studies. Generally, few adverse effects were inferred from ecosystem experiments at concentrations less than the SSD fifth percentile criterion. Exceptions were behavioral impairments in simplified food web studies. No adverse effects were apparent in field studies under conditions that seldom exceeded the criterion. At concentrations greater than the SSD fifth percentile, the magnitudes of adverse effects in the field studies were roughly proportional to the laboratory-based fraction of species with adverse effects in the SSD. Overall, the modeling and field validation comparisons of the chronic criterion values generally supported the relevance and protectiveness of the SSD fifth percentile approach with cadmium.
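
The core fifth-percentile (HC5) computation is compact. The sketch below fits a log-normal SSD to hypothetical species-mean chronic effect concentrations and reads off the fifth percentile; the paper's cadmium SSD uses 27 real species values rather than these stand-ins.

```python
import numpy as np
from scipy import stats

# Hypothetical species-mean chronic effect concentrations (ug/L).
conc = np.array([0.5, 0.9, 1.2, 2.1, 3.4, 4.0, 6.5, 8.8, 12.0, 20.0])
mu, sigma = np.log(conc).mean(), np.log(conc).std(ddof=1)
hc5 = np.exp(mu + sigma * stats.norm.ppf(0.05))   # 5th percentile of the SSD
print(f"HC5 = {hc5:.2f} ug/L")   # candidate chronic criterion concentration
```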

19.
The problem of equipment selection for a production line is considered. Each piece of equipment, also called a unit or block, performs a set of operations. All necessary operations of the line and all available blocks with their costs are known. The difficulty is to choose the most appropriate blocks and group them into (work)stations. There are some constraints that restrict the assignment of different blocks to the same station. Two combinatorial approaches for solving this problem are suggested. Both are based on a novel concept of locally feasible stations. The first approach combinatorially enumerates all feasible solutions, and the second reduces the problem to a search for a maximum weight clique. A Boolean linear program based on a set packing formulation is presented. Computer experiments with benchmark data are described. Their results show that the set packing model is competitive and can be used to solve real-life problems.
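
To ground the reduction, the sketch below solves a maximum-weight clique instance by brute force on a small hypothetical compatibility graph; in the paper, the graph would be built from the locally feasible stations rather than these stand-in vertices.

```python
from itertools import combinations

weights = {0: 5, 1: 4, 2: 3, 3: 6}              # vertex weights (hypothetical)
edges = {(0, 1), (0, 3), (1, 3), (2, 3)}        # compatible pairs

def is_clique(S):
    return all(tuple(sorted(p)) in edges for p in combinations(S, 2))

best = max((S for k in range(1, len(weights) + 1)
            for S in combinations(weights, k) if is_clique(S)),
           key=lambda S: sum(weights[i] for i in S))
print("max-weight clique:", best, "weight:", sum(weights[i] for i in best))
```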

20.
The criterion-related validity of constructed response measures of complex problem-solving skills, social judgment skills, and leader knowledge is examined with respect to two criteria of leader effectiveness: leader achievement and quality of solutions to ill-defined leadership problems. Core aspects of the leader capabilities model are tested using these measures in a series of hierarchical regression analyses. Results indicate that constructed response measures of key leader capabilities account for variance in leader effectiveness and provide initial validation evidence for a central part of the theoretical model. The problem-solving, social judgment and knowledge measures account for significant variance in leadership criteria beyond that accounted for by cognitive abilities, motivations, and personality. Initial evidence also suggests that complex problem-solving skills, social judgment and leader knowledge partially mediate the relationship of cognitive abilities, motivation and personality to leader effectiveness. Implications and generalizability of the results are discussed in light of a related civilian leadership study conducted within the U.S. Department of Defense.
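
The hierarchical-regression logic referred to here amounts to comparing R-squared across nested models. The sketch below simulates that comparison with hypothetical data: a baseline cognitive-ability block, then an added capability block whose incremental R-squared is reported.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
cognitive = rng.normal(size=n)                        # baseline predictor block
capability = 0.5 * cognitive + rng.normal(size=n)     # partially overlapping block
y = 0.4 * cognitive + 0.5 * capability + rng.normal(size=n)

def r2(X, y):
    X = np.column_stack([np.ones(len(y)), X])         # add intercept
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - resid.var() / y.var()

r2_base = r2(cognitive[:, None], y)
r2_full = r2(np.column_stack([cognitive, capability]), y)
print(f"baseline R2 = {r2_base:.3f}, full R2 = {r2_full:.3f}, "
      f"incremental R2 = {r2_full - r2_base:.3f}")
```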
