Similar documents
20 similar documents retrieved (search time: 156 ms)
1.
With the recent slowdown in productivity growth within the economy, R&D has come under scrutiny as a policy target variable. If such targeting is to be effective, it must be realized that not all innovations employed within a firm are induced by the firm through its own R&D: many innovations are purchased through technological licensing or in the form of new capital equipment. Here, interfirm differences in this “make” versus “buy” strategy are analyzed within the context of the Utterback-Abernathy production process lifecycle. Our findings suggest that (1) alternative sources to a firm's R&D for stimulating innovation may prove a viable strategy for federal targeting and (2) extrapolating the Utterback-Abernathy model to an industry formulation has empirical validity.

2.
Capital budgeting models for analyzing real assets typically are based on a set of restrictive assumptions that influence financial managers' decisions and may prevent optimization of the firm's objectives. This research examines the common restrictive assumption that cash flows are intertemporally independent. It first develops an economic-state simulation model, based on a Markov process, for including autocorrelated cash flows in the capital budgeting decision process, and then demonstrates why managers should include autocorrelated cash flows in capital budgeting models by empirically testing the impact of assuming intertemporally independent cash flows on capital budgeting decisions. The results indicate that ignoring autocorrelated cash flows seriously limits the ability of capital budgeting models to provide optimal investment decisions. The model also is very attractive for practical application because it can be implemented with a minimum number of estimates and provides the set of input data required by a number of capital budgeting models. A discussion of the implementation of the model is included.
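The Markov economic-state model itself is not reproduced in the abstract; as a rough stand-in, the sketch below (all parameter values hypothetical) uses an AR(1) shock process to show how positive autocorrelation widens the spread of simulated NPVs even when the mean is essentially unchanged:

```python
import random

def simulate_npv(n_periods=10, base_flow=100.0, rho=0.6, sigma=20.0,
                 rate=0.1, n_runs=2000, seed=42):
    """Monte Carlo NPV with AR(1)-autocorrelated cash-flow shocks.

    rho is the first-order autocorrelation; rho=0 recovers the usual
    intertemporally independent assumption. Returns (mean, variance)
    of the simulated NPVs.
    """
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_runs):
        shock = 0.0
        npv = 0.0
        for t in range(1, n_periods + 1):
            shock = rho * shock + rng.gauss(0.0, sigma)  # AR(1) recursion
            npv += (base_flow + shock) / (1.0 + rate) ** t
        npvs.append(npv)
    mean = sum(npvs) / len(npvs)
    var = sum((x - mean) ** 2 for x in npvs) / (len(npvs) - 1)
    return mean, var

mean_ind, var_ind = simulate_npv(rho=0.0)   # independence assumption
mean_ar, var_ar = simulate_npv(rho=0.6)     # autocorrelated flows
```

With positive rho the NPV variance rises noticeably while the mean stays close to the annuity value, which is the sense in which ignoring autocorrelation understates project risk.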

3.
The common approach to balancing mixed-model assembly lines assumes that the line operators are well trained and that the learning effect is negligible. The assumption is that the line operates in steady state over a long period of time. Time-based competition and frequent design changes in many products make this assumption incorrect, and the effect of learning on mixed-model lines should not be neglected. We define the start-up period and develop a model for line design during start-up. The model can be used to evaluate a proposed line design, or to develop a feasible line design and estimate its cost. It integrates mixed-model learning curves with aggregate planning under learning and mixed-model line design into a comprehensive framework designed to minimize the total cost of the line during the start-up period.
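The abstract gives no formulas, but a common building block for such start-up cost models is the Wright learning curve, sketched here with illustrative numbers (the 80% learning rate and unit wage are assumptions, not the paper's):

```python
import math

def unit_time(t1, n, learning_rate=0.8):
    """Wright learning curve: time for the n-th unit produced.

    learning_rate=0.8 means each doubling of cumulative output cuts
    unit time to 80% of its previous level; t1 is first-unit time.
    """
    b = math.log(learning_rate, 2)   # negative exponent
    return t1 * n ** b

def startup_labor_cost(t1, n_units, wage, learning_rate=0.8):
    """Total labor cost to produce n_units during the start-up period."""
    return wage * sum(unit_time(t1, n, learning_rate)
                      for n in range(1, n_units + 1))
```

Because unit times fall as the operators learn, the start-up cost of a proposed line design is well below the steady-state extrapolation of first-unit times, which is the effect the framework is built to capture.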

4.
The complexity of interdependent structural systems greatly complicates the analysis of any single structure. This is particularly the case when a structure represents some behavioral process. For this reason it is necessary to devise measures which can differentiate qualitatively and quantitatively between structures as well as between subsets (or points) of a particular structure. For example, consider the authority structures of two different organizations. They exhibit similarities and differences which a behavioral analyst tries to identify and explain. Typically, both similarities and differences are compared by structural indices which, on the basis of past data and prior information, tend to reflect certain organizational traits. The purpose of this paper is to investigate one particularly important index—centrality. Centrality conveys the notion that points in a structure are not all ‘equal’. This ‘inequality’ vis-a-vis the structure creates a situation in which certain points will be more ‘central’ than others. In this paper we first identify the characteristics of centrality and observe how they may relate to behavioral research. We then develop a procedure for measuring centrality which is based on information theory.
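The paper's information-theoretic index is not specified in the abstract; as a minimal illustration of the idea that points in a structure are not all 'equal', here is a plain closeness-style centrality computed over BFS distances. The measure and the toy star graph are stand-ins, not the authors' index:

```python
from collections import deque

def distances_from(graph, src):
    """BFS shortest-path distances in an unweighted graph (adjacency dict)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def centrality(graph, node):
    """Closeness centrality: reachable nodes over total distance to them.

    A point with short paths to everyone scores near 1.0; peripheral
    points score lower, quantifying the 'inequality' among points.
    """
    dist = distances_from(graph, node)
    others = [d for n, d in dist.items() if n != node]
    return len(others) / sum(others) if others else 0.0

# A 5-node star: the hub should dominate every leaf.
star = {"hub": ["a", "b", "c", "d"],
        "a": ["hub"], "b": ["hub"], "c": ["hub"], "d": ["hub"]}
```

On the star, the hub scores 1.0 while each leaf scores 4/7, a simple quantitative expression of the structural 'inequality' the abstract describes.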

5.
This paper analyzes a framework in which countries over time pollute and invest in green technologies. Without a climate treaty, the countries pollute too much and invest too little, particularly if intellectual property rights are weak. Nevertheless, short‐term agreements on emission levels then reduce every country's payoff, since countries invest less when they anticipate future negotiations. If intellectual property rights are weak, the agreement should be tougher and more long‐term. Conversely, if the climate agreement happens to be short‐term or absent, intellectual property rights should be strengthened or technological licensing subsidized.

6.
In this paper we review the literature on appointment policies, specifically in terms of the objective function commonly used and the assumptions made about the behavior of demand. First, we provide an economic framework to analyze the problem. Based on this framework we make a critical analysis of the objective functions used in the literature. We also question the validity of the assumption made throughout the literature that demand is exogenous and independent of customers' waiting times. We conclude that the objective functions used in the literature are appropriate only in the case of a central planner facing a demand that is unresponsive to waiting time. For other scenarios, such as a private server facing a demand that does react to waiting time, these objective functions are only shortcuts for the real objective functions that must be used. A more general model is then proposed that fits these scenarios well. Finally, we determine the impact of using the literature's objective functions on optimal appointment policies.

7.
The purpose of this research is to show the usefulness of three relatively simple nonlinear classification techniques for policy-capturing research where linear models have typically been used. This study uses 480 cases to assess the decision-making process used by 24 experienced national bank examiners in classifying commercial loans as acceptable or questionable. The results from multiple discriminant analysis (a linear technique) are compared to those of chi-squared automatic interaction detector analysis (a search technique), log-linear analysis, and logit analysis. Results show that while the four techniques are equally accurate in predicting loan classification, chi-squared automatic interaction detector analysis (CHAID) and log-linear analysis enable the researcher to analyze the decision-making structure and examine the “human” variable within the decision-making process. Consequently, if the sole purpose of research is to predict the decision maker's decisions, then any one of the four techniques turns out to be equally useful. If, however, the purpose is to analyze the decision-making process as well as to predict decisions, then CHAID or log-linear techniques are more useful than linear model techniques.

8.
The importance of evaluating the effectiveness of the purchasing function in firms along multiple criteria has attracted considerable attention. However, few studies have identified the defining elements that constitute purchasing competence. This paper introduces the construct of purchasing competence using a second‐order factor structure derived from purchasing practices identified from the literature. The validity of the construct (purchasing competence) is tested using data from a sample of 179 firms. The results indicate (1) the construct validity of purchasing competence and (2) the predictive validity of purchasing competence, which has a significant positive influence on total quality management performance and customer satisfaction. The implications of these findings for additional research are discussed.

9.
Decision analysis tools often are used in semistructured and ill-structured situations. While some researchers have argued that computerized decision analysis programs may improve decision quality in such situations, research support for this assertion is weak. In this laboratory study, business students used a computerized decision-analysis program with short strategic-management cases to prepare decision reports. Independent raters' perceptions of aided and unaided decision performance were contrasted, attitudes of students toward the program were assessed, individual differences were correlated with attitudes, and the program's impact on students' perceptions of their decision processes and performance was examined. Student attitudes toward the computerized aid were favorable, and individual differences in reactions generally were as predicted. However, the program did not improve the independent ratings of students' decision reports and recommendations. These findings suggest that computerized decision aids should be adopted cautiously. If such aids result in positive user affect and heightened decision confidence without corresponding improvements in decision quality, they may be dysfunctional. Rigorous research methodologies which incorporate independent evaluations of analyses and decisions are recommended for use in future research on computerized decision-analysis programs.

10.
When resources and facilities from different partners must be engaged for a large-scale project with a huge number of indivisible tasks, deciding how many tasks to assign to each collaborating partner often requires coordination and bargaining among the partners, so that the ultimate task allocation is acceptable to every partner in the business union formed for the project. In the current global financial crisis, such cases may appear frequently. In this paper, we first investigate the behavior of such a discrete bargaining model often faced by service-based organizations. In particular, we address the general two-partner situation, where the finite Pareto efficient (profit allocation) set does not satisfy any convenient assumption for deriving a bargaining solution, namely a final profit allocation (corresponding to a task assignment) acceptable to both partners. We show that it is not appropriate for our discrete bargaining model to offer the union only one profit allocation. Modifying the original optimization problem used to derive the Nash Bargaining Solution (NBS), we develop a bargaining mechanism and define a related bargaining solution set that balances profit-earning efficiency against profit-earning fairness. We then show that our mechanism also suits both Nash's original concave bargaining model and its continuous extension without concavity of the Pareto efficient frontier on profit allocation.
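The paper's modified optimization problem is not given in the abstract; the classical discrete analogue it starts from, maximizing the Nash product over a finite Pareto set, can be sketched as follows (the example allocations and disagreement point are hypothetical):

```python
def nash_bargaining_discrete(allocations, disagreement=(0.0, 0.0)):
    """Pick, from a finite set of two-partner profit allocations, the
    point(s) maximizing the Nash product of gains over the disagreement
    payoffs. All ties are returned, since a discrete set need not
    single out one allocation.
    """
    d1, d2 = disagreement
    feasible = [(a, b) for a, b in allocations if a > d1 and b > d2]
    if not feasible:
        return []
    best = max((a - d1) * (b - d2) for a, b in feasible)
    return [p for p in feasible
            if abs((p[0] - d1) * (p[1] - d2) - best) < 1e-12]

# Hypothetical finite Pareto set of (partner 1, partner 2) profits.
pareto = [(10, 2), (8, 5), (6, 6), (5, 7), (2, 9)]
solution = nash_bargaining_discrete(pareto)
```

Returning a solution set rather than forcing a single point mirrors the abstract's argument that offering the union only one allocation is inappropriate in the discrete case.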

11.
In this paper linear production functions with highly correlated independent variables are estimated. Because many production processes require the use of inputs in fixed proportions, multicollinearity is usually a serious problem. This is usually the case when one attempts to estimate a linear or Cobb-Douglas (in log-linear form) production function. Estimates of the marginal products of a linear production function are obtained by employing the first-order conditions of the output-maximizing solution given a cost constraint. Parameter estimates are determined for the case where the input costs are constants. The general case, where the input prices are functions of utilization levels, is delineated.
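One standard remedy for the multicollinearity described here is ridge regularization; the closed-form two-input sketch below (with invented, nearly collinear data) is a generic illustration of the problem and that remedy, not the paper's first-order-condition estimator:

```python
def ridge_2var(xs1, xs2, ys, lam=0.0):
    """Closed-form (ridge) least squares for y = b1*x1 + b2*x2, no intercept.

    lam=0.0 gives plain OLS; lam > 0 stabilizes the estimates when x1
    and x2 are nearly collinear (inputs used in fixed proportions).
    """
    s11 = sum(a * a for a in xs1) + lam
    s22 = sum(b * b for b in xs2) + lam
    s12 = sum(a * b for a, b in zip(xs1, xs2))
    sy1 = sum(a * y for a, y in zip(xs1, ys))
    sy2 = sum(b * y for b, y in zip(xs2, ys))
    det = s11 * s22 - s12 * s12   # near zero when inputs are collinear
    return ((s22 * sy1 - s12 * sy2) / det,
            (s11 * sy2 - s12 * sy1) / det)

# Two inputs in (almost) fixed proportion: x2 ~= x1.
xs1 = [1.0, 2.0, 3.0, 4.0]
xs2 = [1.01, 2.02, 2.99, 4.01]
ys = [2.0 * a + 3.0 * b for a, b in zip(xs1, xs2)]
b1, b2 = ridge_2var(xs1, xs2, ys)          # OLS: unstable under noise
b1r, b2r = ridge_2var(xs1, xs2, ys, lam=1.0)  # ridge: shrunk but stable
```

With noiseless data OLS still recovers the true coefficients, but the tiny determinant signals that any noise would blow the individual estimates apart; ridge sacrifices a little fit for stable, nearly equal coefficients.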

12.
A model of a production process that uses an unscheduled set-up policy and fraction-defective control charts to control current production is developed, taking into consideration all the relevant costs, namely the cost of sampling, the cost of not detecting a change in the process, the cost of a false indication of change, and the cost of re-adjusting after detected changes. The model is based on the concept of the expected time between detection of changes calling for set-ups. It is shown that the combination of unscheduled set-ups and control charts can be utilized optimally if one uses the combination of sample size, sampling interval, and extent of control limits from the process average that provides the minimum expected total cost per unit of time. The cost of a production process with unscheduled set-ups controlled by the appropriate optimal control charts is compared to the cost of a production process using scheduled set-ups at optimum intervals in conjunction with its appropriate control charts. This comparison yields criteria for selecting production processes with scheduled set-ups using optimal set-up intervals over unscheduled set-ups. Suggestions are made to evaluate the optimal process set-up strategy and the accompanying decision parameters, for any specific cost data, by computer enumeration.
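The paper's full cost model is richer than an abstract can carry; the sketch below enumerates candidate (sample size, sampling interval, limit width) designs against a simplified per-hour cost built from the same four cost categories. All cost figures, rates, and the normal approximation to the binomial are assumptions for illustration:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def expected_cost(n, h, k, p0=0.05, p1=0.12, lam=0.01,
                  c_sample=0.5, c_false=50.0, c_undetected=100.0,
                  c_reset=25.0):
    """Rough expected cost per hour for a fraction-defective chart.

    n: sample size, h: hours between samples, k: control-limit width.
    p0/p1: in-control / shifted defective rates; lam: shifts per hour.
    """
    sd0 = math.sqrt(p0 * (1 - p0) / n)
    ucl = p0 + k * sd0
    alpha = 1.0 - phi((ucl - p0) / sd0)       # false-alarm probability
    sd1 = math.sqrt(p1 * (1 - p1) / n)
    beta = phi((ucl - p1) / sd1)              # miss probability
    arl_out = 1.0 / max(1.0 - beta, 1e-9)     # samples to detect a shift
    return (c_sample * n / h                  # sampling
            + c_false * alpha / h             # false indications of change
            + lam * (c_undetected * arl_out * h   # running undetected
                     + c_reset))              # re-adjusting after detection

designs = [(n, h, k) for n in (20, 50, 100)
           for h in (1, 2, 4) for k in (2.0, 2.5, 3.0)]
best = min(designs, key=lambda d: expected_cost(*d))
```

The enumeration over a small grid is exactly the computer-enumeration style of evaluation the abstract suggests for specific cost data.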

13.
AL Soyster, HD Sherali. Omega, 1981, 9(4): 381-388
Many of the contemporary models used to describe the behavior of the mineral industries assume a competitive market, i.e., one in which market price is equal to marginal production cost. One such recent model of the worldwide copper industry is the MIDAS-II model developed for the Bureau of Mines [3, 4]. This model is used to project production and prices up through the year 2000. The purpose of this paper is to demonstrate the importance of the assumed market structure in the construction of these forecasts. If the market structure of the US copper industry is assumed to be comprised of a few large firms (an oligopoly), then forecasts based upon exactly the same data base differ significantly from the competitive market assumption.

14.
Updating production plans typically is achieved by rolling the planning horizon forward one period at a time, each time including the latest information in order to determine the best course of action to pursue in the present period. Theoretical planning-horizon studies have identified the conditions by which the production decisions in the current and some specified number of future periods remain optimal given some set of future demands. Motivated by these findings, this study addresses the replanning frequency in a hierarchical production planning problem where no planning-horizon theorems are available. In this problem the aggregate production plan and the master production schedule are linked by a rolling-horizon practice. Empirical experimentation indicates that under certain cost and demand conditions the master production schedule need not be updated every period. If a schedule does not need to be updated for several periods, the schedule for these periods can be frozen to provide stability for planning components at lower levels in the bill of material of the products. The results of this study thus provide some reference for the determination of the frozen portion of the master production schedule.
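The mechanics of rolling the horizon forward while freezing the near-term schedule can be sketched in a few lines; the lot-for-lot planning rule here is a deliberate placeholder for the paper's hierarchical optimization, and the demand figures are invented:

```python
def rolling_plan(demand, horizon=4, frozen=2):
    """Roll a planning horizon forward, freezing the first `frozen`
    periods of each plan (frozen must be >= 1).

    The planning rule is trivial lot-for-lot (plan = demand), chosen
    only to show the mechanics of replanning and freezing.
    """
    committed = []
    t = 0
    while t < len(demand):
        window = demand[t:t + horizon]     # latest demand information
        plan = list(window)                # 'optimize' over the window
        committed.extend(plan[:frozen])    # frozen portion is executed
        t += frozen                        # roll forward past the frozen span
    return committed[:len(demand)]
```

Lengthening `frozen` trades responsiveness to new information for the schedule stability that lower bill-of-material levels need, which is the trade-off the study quantifies empirically.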

15.
Research relating to sequencing rules in simple job shops has proliferated, but there has not been a corresponding proliferation of research evaluating similar sequencing rules in more complex assembly job shops. In a simple job shop, all operations are performed serially; but an assembly shop encompasses both serial and parallel operations. As a result of the increased complexity of assembly shops, the results associated with the performance of sequencing rules in simple job shops cannot be expected for an assembly shop. In this paper, 11 sequencing rules (some of which are common to simple job shops and some designed specifically for assembly shops) are evaluated using a simulation analysis of a hypothetical assembly shop. The simulation results are analyzed using an ANOVA procedure that identifies significant differences in the results of several performance measures. Sensitivity analysis also is performed to determine the effect of job structure on the performance of the sequencing rules.
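Simulating an assembly shop is beyond a short sketch, but the core effect of a sequencing rule is easy to show on a single serial machine: shortest-processing-time (SPT) beats first-come-first-served (FCFS) on mean flow time. The job times below are invented, and this single-machine setting is precisely the simple-shop case whose results, as the abstract warns, need not carry over to assembly shops:

```python
def mean_flow_time(processing_times, rule):
    """Mean flow time on a single machine under a sequencing rule.

    rule: 'FCFS' keeps arrival order; 'SPT' sorts by processing time.
    """
    seq = sorted(processing_times) if rule == "SPT" else list(processing_times)
    t, total = 0.0, 0.0
    for p in seq:
        t += p            # completion time of this job
        total += t        # accumulate flow times
    return total / len(seq)

jobs = [7, 2, 9, 4, 1]
fcfs = mean_flow_time(jobs, "FCFS")   # 15.8 for these jobs
spt = mean_flow_time(jobs, "SPT")     # 9.6 for these jobs
```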

16.
This article summarizes the application of a forecasting model. Forecasts are made of monthly sales of products which do not change in style on an annual basis. The model is an exponential smoothing model. Adjustments of the parameters of the model are made whenever the average forecast error over the previous four periods is too large to be explained solely by unassignable causes. The efficiency gained in using the model is measured by the ratio of the standard deviation of the forecast errors to the standard deviation of sales. If this ratio is less than one, then the safety stock level that is carried for a given product can be reduced if sales are forecasted with the model and the standard deviation of the forecast errors is used to determine the safety stock level. The net effect is the reduction in the cost of carrying safety stocks. The results of the proposed model are also compared to a similar set of results generated from a basic exponential model.
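The efficiency ratio described here, standard deviation of forecast errors over standard deviation of sales, is easy to compute; the sketch below uses simple exponential smoothing with an assumed alpha and invented sales figures, and omits the article's parameter-adjustment rule:

```python
import statistics

def ses_forecasts(sales, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecasts.

    The first forecast is seeded with the first actual, so the series
    has the same length as `sales`.
    """
    f = [sales[0]]
    for x in sales[:-1]:
        f.append(alpha * x + (1 - alpha) * f[-1])
    return f

def efficiency_ratio(sales, alpha=0.3):
    """Std dev of forecast errors over std dev of sales.

    Below 1.0 means forecast-based safety stock can be smaller than
    safety stock sized from raw sales variability.
    """
    f = ses_forecasts(sales, alpha)
    errors = [x - fi for x, fi in zip(sales, f)]
    return statistics.pstdev(errors) / statistics.pstdev(sales)

# Invented monthly sales with a level shift the model adapts to.
sales = [100, 102, 98, 101, 99, 150, 152, 149, 151, 148]
r = efficiency_ratio(sales)
```

For a series like this one, where a level shift inflates raw sales variability far more than it inflates the (adapting) forecast errors, the ratio falls well below one and safety stock can be cut accordingly.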

17.
The Campbell and Fiske criteria for assessing the construct validity of multitrait-multimethod (MTMM) matrices have had a long history of use. While various statistical techniques, including ANOVA, have attempted to provide rigor to the MTMM matrix design, numerous problems still remain unsolved. Part of the problem in using an MTMM matrix is the assumption of measurement independence. This study attempts to illustrate the misleading inferences that often occur from MTMM analysis when method variance overlap is not accurately assessed. Three questionnaires were designed that were not method independent. Traditional procedures for assessing MTMM matrices suggested the three scaling formats used were not burdened with unusual method variance. A reanalysis of the MTMM matrix employing a confirmatory factor analysis technique illustrated that method variance was a problem. Finally, the need for studies that concentrate on the nature of method variance, its causes and effects, is discussed.

18.
This paper presents the results of a natural experiment conducted at a U.S. high‐tech manufacturer. The experiment had as its treatment the adoption, at a single point in time, of a comprehensive enterprise information system throughout the functional groups charged with customer order fulfillment. This information technology (IT) adoption was not accompanied by substantial contemporaneous business process changes. Immediately after adoption, lead time and on‐time delivery performance suffered, causing a “performance dip” similar to those observed after the introduction of capital equipment onto shop floors. Lead times and on‐time delivery percentages then improved along a learning curve. After several months, performance in these areas improved significantly relative to preadoption levels. These observed performance patterns could not be well explained by rival causal factors such as order, production, and inventory volumes; head count; and new product introductions. Thus, this longitudinal research presents initial evidence of a causal link between IT adoption and subsequent improvement in operational performance measures, as well as evidence of the timescale over which these benefits appear.

19.
The classical Bagehot conception of a Lender of Last Resort (LOLR) that lends to illiquid banks has been criticized on two grounds: On the one hand, the distinction between insolvency and illiquidity is not clear‐cut; on the other, a fully collateralized repo market allows central banks to provide the adequate aggregate amount of liquidity and leave to the banks the responsibility of lending uncollateralized. The object of this paper is to analyze these issues rigorously by providing a framework in which liquidity shocks cannot be distinguished from solvency ones and then asking whether there is a need for a LOLR and how it should operate in the absence of systemic threats. Determining the optimal LOLR policy requires a careful modeling of the structure of the interbank market and of the closure policy. In our setup, the results depend upon the existence of moral hazard. If the main source of moral hazard is the banks' lack of incentives to screen loans, then the LOLR may have to intervene to improve the efficiency of an unsecured interbank market in crisis periods; if instead the main source of moral hazard is loan monitoring, then the interbank market should be secured and the LOLR should never intervene. (JEL: E58, G28)

20.
Since Eli Goldratt first appeared on the scene in the late 1970s, his ideas concerning production management have generated a huge amount of interest, controversy, and misunderstanding. These ideas have been proliferated under several names such as optimized production technology (OPT), drum-buffer-rope (DBR), synchronized manufacturing (SM), and theory of constraints (TOC). Although there seems to be general agreement on the importance of how capacity-constrained resources are scheduled, research aimed at advancing the state of the art for the specific problem addressed by DBR continues to be limited by prior misunderstandings and the lack of a rigorous examination by the academic community. This paper seeks to advance the state of research on constraint scheduling in several ways. First, it presents a concise history of the evolution of DBR. It then explains the use of rods in constraint scheduling. Next, it presents in detail the solution algorithm incorporated by the Goldratt Institute in their production software and, finally, relates that algorithm to alternative methods. In the process of these activities, several lingering misconceptions are resolved.
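The Goldratt Institute's actual algorithm is detailed in the paper; the essence of drum-buffer-rope, releasing material a fixed time buffer ahead of the constraint's schedule, can be sketched minimally (job names, start times, and buffer length are invented):

```python
def dbr_release_times(drum_schedule, buffer_time):
    """Drum-buffer-rope sketch.

    The 'drum' is the schedule of the capacity-constrained resource;
    the 'rope' releases material for each job exactly `buffer_time`
    before its scheduled start on the constraint, so the protective
    buffer is in place. Releases never go negative.
    """
    return {job: max(0.0, start - buffer_time)
            for job, start in drum_schedule.items()}

# Hypothetical constraint schedule: job -> scheduled start time (hours).
drum = {"J1": 0.0, "J2": 5.0, "J3": 12.0}
releases = dbr_release_times(drum, buffer_time=4.0)
```

Tying every release to the drum rather than to local work-center efficiency is what keeps work-in-process bounded while protecting the constraint from starvation.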


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号