Similar Literature
20 similar documents retrieved (search time: 21 ms).
1.
Multiattribute utility theory (MAUT) was employed to model the professional judgments of external auditors. Fully developed MAUT models, elicited from each subject according to Keeney and Raiffa's [6] procedures, were used to predict the internal control system evaluations made by auditor-subjects. Correlation analyses were used to compare the predictive ability of the "correct" MAUT models to the accuracy of models developed under simplifications of the MAUT procedures. One simplified model resulted from relaxing the attribute-independence requirements that determine the functional forms. A second modified MAUT function was formed using unitary weightings on the conditional utility functions instead of elicited scaling constants. Tests showed essentially no significant differences in predictive accuracy among the models in the context of this study.
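To make the simplification concrete, here is a minimal sketch, in Python, of the additive MAUT form evaluated once with elicited scaling constants and once with unitary weightings. All attribute names, conditional utilities, and constants are hypothetical illustrations, not values from the study.

```python
# Additive MAUT sketch: elicited scaling constants vs. unitary weightings.
# All attribute names, weights, and conditional utilities are hypothetical.

attributes = ["segregation_of_duties", "documentation", "access_controls"]

# Conditional (single-attribute) utilities for one internal control system,
# each scaled to [0, 1] as in a standard MAUT elicitation.
conditional_utility = {"segregation_of_duties": 0.8,
                       "documentation": 0.5,
                       "access_controls": 0.9}

# Scaling constants elicited from the auditor (sum to 1 for the additive form).
elicited_k = {"segregation_of_duties": 0.5,
              "documentation": 0.2,
              "access_controls": 0.3}

def additive_utility(weights, utilities):
    """U(x) = sum_i k_i * u_i(x_i), the additive MAUT form."""
    return sum(weights[a] * utilities[a] for a in utilities)

# "Correct" model with elicited constants.
u_elicited = additive_utility(elicited_k, conditional_utility)

# Simplified model: unitary (equal) weightings on the conditional utilities.
unitary_k = {a: 1.0 / len(attributes) for a in attributes}
u_unitary = additive_utility(unitary_k, conditional_utility)

print(f"elicited-weight utility: {u_elicited:.3f}")
print(f"unitary-weight utility:  {u_unitary:.3f}")
```

Under the additive form the two variants differ only in the weight vector, so their rankings of systems can coincide whenever the conditional utilities carry most of the signal.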

2.
This paper reports a field experiment that compared two approaches to decision analysis, called cases and criteria scaling, in terms of their acceptance by users and their predictive performance. The case method simulates decisions likely to occur in practice. Each decision is described by particular values of the decision criteria; experts consider the criteria values present in each decision and make a judgment, and regression is used to correlate the criteria values with the judgments. The regression equation provides the prediction model. The criteria scaling method decomposes the decision task: experts weight each criterion and specify how decisions change across levels of each criterion. The predictive model is the sum of each criterion weight multiplied by the criterion-predictor functional relationship. Both methods were applied to build models that predicted the demand for nursing time, using patient severity indicators, for two hospital units. The case method showed considerable predictive accuracy and drew favorable participant reactions. Predictions made by criteria scaling overestimated needs, and participants viewed the method as inaccurate and hard to understand. The implications of these findings are discussed.
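A rough sketch of the two approaches under invented data: the case method fits an ordinary least-squares regression of simulated expert judgments on criteria values, while criteria scaling combines expert-specified weights with criterion-level functions. Every number below is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Case method: regress expert judgments on criteria values. ---
# Hypothetical data: 30 simulated patient cases, 3 severity criteria.
X = rng.uniform(0, 10, size=(30, 3))            # criteria values per case
true_w = np.array([1.5, 0.8, 2.0])
y = X @ true_w + rng.normal(0, 1.0, size=30)    # expert judgments (nursing hours)

A = np.column_stack([np.ones(30), X])           # add intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # OLS fit -> prediction model

def case_method_predict(criteria):
    return coef[0] + criteria @ coef[1:]

# --- Criteria scaling: expert-specified weights and criterion functions. ---
expert_weights = np.array([0.4, 0.2, 0.4])      # hypothetical elicited weights

def criterion_functions(criteria):
    # Expert-specified judgment change across levels of each criterion.
    return np.array([2.0 * criteria[0], 1.0 * criteria[1], 3.0 * criteria[2]])

def criteria_scaling_predict(criteria):
    return expert_weights @ criterion_functions(criteria)

new_case = np.array([5.0, 3.0, 7.0])
print("case method:     ", round(float(case_method_predict(new_case)), 1))
print("criteria scaling:", round(float(criteria_scaling_predict(new_case)), 1))
```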

3.
In this article, we develop statistical models to predict the number and geographic distribution of fires caused by earthquake ground motion and tsunami inundation in Japan. Using new, uniquely large, and consistent data sets from the 2011 Tōhoku earthquake and tsunami, we fitted three types of models: generalized linear models (GLMs), generalized additive models (GAMs), and boosted regression trees (BRTs). This is the first time the latter two have been used in this application. A simple conceptual framework guided identification of candidate covariates. Models were then compared based on their out-of-sample predictive power, goodness of fit to the data, ease of implementation, and the relative importance of the framework concepts. For the ground motion data set, we recommend a Poisson GAM; for the tsunami data set, a negative binomial (NB) GLM or NB GAM. The best models generate out-of-sample predictions of the total number of ignitions in the region that are within one or two of the observed total, and prefecture-level prediction errors average approximately three. All models demonstrate predictive power far superior to four models from the literature that were also tested. A nonlinear relationship is apparent between ignitions and ground motion, so for GLMs, which assume a linear response-covariate relationship, instrumental intensity was the preferred ground motion covariate because it captures part of that nonlinearity. Measures of commercial exposure were preferred over measures of residential exposure for both ground motion and tsunami ignition models. This may vary in other regions, but it nevertheless highlights the value of testing alternative measures for each concept. The models with the best predictive power included two or three covariates.
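As a hedged illustration of the recommended model families, the sketch below fits a Poisson GLM and a negative binomial GLM to simulated ignition counts with statsmodels. The covariates are invented stand-ins for instrumental intensity and commercial exposure; the GAM and BRT variants would replace the linear predictor with smooths or boosted trees.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200

# Hypothetical covariates: instrumental intensity and commercial exposure.
intensity = rng.uniform(4, 7, n)
commercial_exposure = rng.uniform(0, 1, n)
X = sm.add_constant(np.column_stack([intensity, commercial_exposure]))

# Simulated ignition counts per geographic cell.
mu = np.exp(-6 + 0.9 * intensity + 1.2 * commercial_exposure)
ignitions = rng.poisson(mu)

# Poisson GLM (the family recommended for the ground-motion data set).
poisson_fit = sm.GLM(ignitions, X, family=sm.families.Poisson()).fit()

# Negative binomial GLM (the family recommended for the tsunami data set).
nb_fit = sm.GLM(ignitions, X, family=sm.families.NegativeBinomial()).fit()

print("coefficients:", poisson_fit.params.round(3))
print("Poisson AIC:", round(poisson_fit.aic, 1),
      " NB AIC:", round(nb_fit.aic, 1))
```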

4.
A hybrid genetic algorithm for production and distribution
Omega, 2005, 33(4): 345-355
This paper develops a hybrid genetic algorithm for production and distribution problems in multi-factory supply chain models. Supply chain problems usually involve multi-criterion decision making, considering, for example, operating cost, service level, and resource utilization; these criteria are numerous and interrelated. To organize them, the analytic hierarchy process (AHP) is utilized: it provides a systematic approach for decision makers to assign weightings and relate the criteria. Meanwhile, genetic algorithms (GAs) are utilized to allocate jobs to suitable production plants, and the genetic operators adopted to improve the genetic search are introduced and discussed. Finally, a hypothetical production-distribution problem is solved by the proposed algorithm, and the optimization results show that it is reliable and robust.
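The hybrid mechanism can be sketched as follows, assuming invented cost and delivery-time data: AHP-style weights collapse the criteria into a single fitness value, and a simple GA searches over job-to-plant allocations. This illustrates the general idea, not the paper's specific operators.

```python
import random
random.seed(42)

N_JOBS, N_PLANTS = 12, 3
# Hypothetical per-plant unit costs and delivery times for each job.
cost = [[random.uniform(1, 10) for _ in range(N_PLANTS)] for _ in range(N_JOBS)]
time = [[random.uniform(1, 5) for _ in range(N_PLANTS)] for _ in range(N_JOBS)]

# AHP-style weights relating the two criteria (hypothetical values).
W_COST, W_TIME = 0.6, 0.4

def fitness(chrom):
    # Lower weighted cost+time is better; chrom[j] = plant assigned to job j.
    total = sum(W_COST * cost[j][p] + W_TIME * time[j][p]
                for j, p in enumerate(chrom))
    return -total  # the GA maximizes fitness

def crossover(a, b):
    cut = random.randrange(1, N_JOBS)          # one-point crossover
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.1):
    return [random.randrange(N_PLANTS) if random.random() < rate else g
            for g in chrom]

pop = [[random.randrange(N_PLANTS) for _ in range(N_JOBS)] for _ in range(40)]
for _ in range(100):                           # generations
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                           # truncation selection
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]

best = max(pop, key=fitness)
print("best allocation:", best, " objective:", round(-fitness(best), 2))
```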

5.
Steven M. Quiring, Risk Analysis, 2011, 31(12): 1897-1906
This article compares statistical methods for modeling power outage durations during hurricanes and examines the predictive accuracy of these methods. Accurate predictions of power outage durations are valuable because utility companies can use the information to plan their restoration efforts more efficiently. The information can also help inform customers and public agencies of expected outage times, enabling better collective response planning and coordination of restoration efforts for other critical infrastructures that depend on electricity. In the long run, outage duration estimates for future storm scenarios may help utilities and public agencies better allocate risk management resources to balance the disruption from hurricanes against the cost of hardening power systems. We compare the out-of-sample predictive accuracy of five distinct statistical models for estimating power outage durations caused by Hurricane Ivan in 2004. The methods compared include both regression models (accelerated failure time (AFT) and Cox proportional hazards (Cox PH) models) and data mining techniques (regression trees, Bayesian additive regression trees (BART), and multivariate adaptive regression splines (MARS)). We then validate our models against two other hurricanes. Our results indicate that BART yields the best prediction accuracy and that it is possible to predict outage durations with reasonable accuracy.
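As one illustration of the regression side of the comparison, the sketch below fits a Cox proportional hazards model to simulated outage durations using the lifelines library; all covariate names and data are hypothetical. BART, the best performer in the study, requires a dedicated package and is not shown.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 300

# Hypothetical outage records: covariates and observed restoration times.
df = pd.DataFrame({
    "wind_speed": rng.uniform(20, 70, n),       # m/s at the outage location
    "tree_density": rng.uniform(0, 1, n),
    "crew_distance": rng.uniform(1, 50, n),     # km to nearest crew depot
})
# Simulated restoration hazard: stronger wind and denser trees slow repair.
hazard = 0.05 * np.exp(-0.02 * df["wind_speed"] - 0.5 * df["tree_density"])
df["duration_hrs"] = rng.exponential(1.0 / hazard)
df["restored"] = 1                              # all outages restored (no censoring)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration_hrs", event_col="restored")
cph.print_summary()                             # hazard ratios per covariate
```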

6.
Research on the relationship between psychopathy and leadership effectiveness has adopted very different perspectives on psychopathy. To advance this field of research, the current paper introduces an overarching framework of "successful psychopathy" (Lilienfeld, Watts, & Smith, 2015) to the leadership domain, comprising three conceptual models (the differential-severity model, the moderated-expression model, and the differential-configuration model) and their "hybrid" forms, which combine two or three of these models. We test the three alternative conceptual models and four hybrid models in two independent samples of leader-subordinate dyads (N1 = 178 and N2 = 668) in which leaders' self-reported psychopathy is related to a range of subordinate-rated effectiveness criteria, including three performance dimensions and charismatic leadership. A recurrent pattern of findings across both studies provides evidence of differential effects of the various psychopathy subdimensions, whereas little support was found for the models assuming curvilinear and/or moderated effects. Implications for research on leader psychopathy are discussed.

7.
This study compares the performance of one constant-parameter forecasting model with two adaptive models, using MAD and bias as the criteria for evaluating performance. Contrary to the expected superiority of the adaptive models, no significant differences were found between the models over known horizontal, trend, and seasonal demand patterns. However, some of the practical advantages of adaptive forecasting models in business suggest their continued use.
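A minimal sketch of such a comparison: constant-parameter exponential smoothing against a Trigg-Leach-style adaptive model, scored by MAD and bias on a simulated demand series. The smoothing constants and data are hypothetical, and the original study's exact model forms may differ.

```python
import random
random.seed(3)

# Hypothetical demand series with a step change (illustrative only).
demand = ([100 + random.gauss(0, 5) for _ in range(30)] +
          [130 + random.gauss(0, 5) for _ in range(30)])

def smooth_constant(series, alpha=0.2):
    """Simple exponential smoothing with a fixed smoothing constant."""
    f, out = series[0], []
    for x in series:
        out.append(f)
        f += alpha * (x - f)
    return out

def smooth_adaptive(series):
    """Trigg-Leach style adaptive smoothing: alpha tracks the tracking signal."""
    f, e, m, out = series[0], 0.0, 0.0, []
    for x in series:
        out.append(f)
        err = x - f
        e = 0.2 * err + 0.8 * e           # smoothed error
        m = 0.2 * abs(err) + 0.8 * m      # smoothed absolute error
        alpha = abs(e / m) if m else 0.2  # adaptive smoothing constant
        f += alpha * err
    return out

def mad(actual, forecast):
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def bias(actual, forecast):
    return sum(a - f for a, f in zip(actual, forecast)) / len(actual)

for name, fc in [("constant", smooth_constant(demand)),
                 ("adaptive", smooth_adaptive(demand))]:
    print(f"{name}: MAD={mad(demand, fc):.2f}  bias={bias(demand, fc):.2f}")
```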

8.
Contingency models of information systems planning predict that no single planning approach will suit all organizations' needs. Little empirical research has been undertaken, however, to evaluate this prediction. Accordingly, we used McFarlan, McKenney, and Pyburn's (1983) strategic-grid model to study the information systems planning problems encountered by 49 governmental agencies. Twenty-seven agencies were required to follow a planning approach best suited to organizations that have a high level of dependence on both their existing and proposed systems; we predicted that agencies not having these characteristics would encounter the most problems with the approach. The remaining 22 agencies could choose their own planning approach, and we studied this latter group to determine whether the problems encountered by the first group could be attributed to the mandated approach. Overall, the empirical results were equivocal. Some results indicated that more planning problems were encountered by agencies for which the mandated approach was not appropriate to their position in the strategic grid; other results did not support this proposition. More work therefore needs to be undertaken to evaluate the predictive and explanatory power of contingency models of information systems planning. In addition, our research indicates a need to develop more rigorous theories of information systems planning behaviors, to improve the instruments used to measure these behaviors, to explore the relationship between information systems planning behaviors and organizational effectiveness, to investigate how organizational learning affects planning behaviors, and to determine which types of information systems planning problems diffuse through organizations and which remain localized.

9.
There is increasing evidence to support the predictive power of social epidemiological models such as Effort-Reward Imbalance (Siegrist, 1996) and the Job-Strain Model (Karasek and Theorell, 1990) for explaining occupational stress, although it has been suggested that the models may make distinctive contributions to explaining work stress in specific work settings. Alternatively, it has been suggested that the explanatory power of the different models might be enhanced if they were combined. The aim of this paper is to explore these questions by examining the power of the two models, both separately and in combination, to explain job satisfaction and mental distress in general medical practice. The analysis is based on data collected from a postal survey of the staff (N = 1089, response rate 70%) of 81 practices randomly selected from all general practices in the National Health Service Executive South East region. The results show that while both models predicted mental distress and job satisfaction, the models that combined dimensions from both were the strongest predictors.
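The "combined models predict best" pattern can be illustrated with a toy regression that compares the variance explained by each model's dimensions separately and together. The scores and effect sizes below are simulated, not the survey data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(11)
n = 500

# Hypothetical standardized scores for the two models' core dimensions.
effort_reward_imbalance = rng.normal(size=n)   # ERI model
job_strain = rng.normal(size=n)                # Job-Strain model
distress = (0.4 * effort_reward_imbalance + 0.3 * job_strain
            + rng.normal(scale=1.0, size=n))   # simulated mental distress

for label, X in [("ERI only", effort_reward_imbalance[:, None]),
                 ("Job-Strain only", job_strain[:, None]),
                 ("combined", np.column_stack([effort_reward_imbalance,
                                               job_strain]))]:
    r2 = LinearRegression().fit(X, distress).score(X, distress)
    print(f"{label:16s} R^2 = {r2:.3f}")
```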

10.
This paper uses two recently developed tests to identify neglected nonlinearity in the relationship between excess returns on four asset classes and several economic and financial variables. Having found some evidence of possible nonlinearity, it then investigates whether the predictive power of these variables could be enhanced by using neural network models instead of linear regression or GARCH models. Some evidence of nonlinearity in the relationships between the explanatory variables and large stocks and corporate bonds was found. It was also found that the GARCH models are conditionally efficient with respect to neural network models, but the neural network models outperform GARCH models if financial performance measures are used. Consistent with the results of the tests for neglected nonlinearity, the neural network forecasts are conditionally efficient with respect to linear regression models for large stocks and corporate bonds, whereas the evidence is not statistically significant for small stocks and intermediate-term government bonds. This difference persists even when financial performance measures for individual asset classes are used for comparison.
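A hedged sketch of the neural-network-versus-linear comparison on simulated excess returns that contain a mild nonlinearity; the predictor names are hypothetical, and a full replication would also fit GARCH models of the conditional variance.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
n = 400

# Hypothetical predictors: dividend yield and term spread (standardized).
X = rng.normal(size=(n, 2))
# Simulated excess returns with a mild nonlinearity in the first predictor.
y = 0.3 * X[:, 0] + 0.2 * X[:, 0] ** 2 + 0.1 * X[:, 1] + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = X[:300], X[300:], y[:300], y[300:]

linear = LinearRegression().fit(X_tr, y_tr)
nnet = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                    random_state=0).fit(X_tr, y_tr)

# The neural network can pick up the quadratic term the linear model misses.
print("linear out-of-sample R^2:", round(linear.score(X_te, y_te), 3))
print("neural out-of-sample R^2:", round(nnet.score(X_te, y_te), 3))
```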

12.
J. M. Dixie, Omega, 1974, 2(3): 415-419
A company bidding by sealed tender needs to know the relationship between its bid price and its chances of winning the contract. Previously published models for computing the probability of winning are examined and found to be inaccurate. The problem is reformulated, and a new general predictive model for computing the probability of winning is developed. The method of computation is illustrated by a simple worked example.
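The core computation can be sketched directly: if competitors' bids are independent with known distributions, the probability of winning at a given price is the product of the probabilities of undercutting each rival. The lognormal rival-bid distributions below are hypothetical, not the paper's model.

```python
import math

def win_probability(bid, competitor_cdfs):
    """P(win) with a sealed bid: the bid must undercut every competitor.

    competitor_cdfs: for each rival, a function giving P(rival's bid <= x).
    Assuming independent rivals, P(win) = prod_i (1 - F_i(bid)).
    """
    return math.prod(1.0 - F(bid) for F in competitor_cdfs)

def lognormal_cdf(mu, sigma):
    # CDF of a lognormal bid distribution (hypothetical rival behavior).
    return lambda x: 0.5 * (1 + math.erf((math.log(x) - mu) /
                                         (sigma * math.sqrt(2))))

# Three hypothetical rivals whose bids cluster around 100-110.
rivals = [lognormal_cdf(math.log(100), 0.10),
          lognormal_cdf(math.log(105), 0.12),
          lognormal_cdf(math.log(110), 0.15)]

for bid in (90, 95, 100, 105, 110):
    print(f"bid {bid}: P(win) = {win_probability(bid, rivals):.3f}")
```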

13.
The objective of this study is to extend previous research on relationships between total quality management (TQM), context, and performance, and on their 'fit', using multiple methods. We combine artificial neural networks (ANNs) with structural equation modelling (SEM) to analyse several hypotheses and propositions. This is the first study in this area of research that utilises ANNs and a triangulation technique in the presence of several contextual factors. The SEM analyses suggest that company size and industry type may have contingency effects on some of the TQM practices and/or TQM-performance relationships. However, the ANN models show that these two contingency factors do not moderate TQM outcomes, implying that all organisations can benefit from TQM regardless of size and type. These models also show that formal TQM implementation and/or ISO certification does not add any predictive power to the ANN models except in one case: TQM implementation and/or ISO certification added to organisational effectiveness and customer results in predicting financial and market (F&M) results. The results further indicate that even though implementing TQM alone has a bigger impact on F&M results than obtaining ISO certification alone, combining the two has an even greater impact on these results. Joint implementation leads to greater improvements in organisational effectiveness, which, in turn, has a positive effect on customer results and consequently on F&M results. This is a unique finding within the context of moderator effects on TQM-performance relationships.

14.
John Fripp, Omega, 1985, 13(1): 19-28
This article discusses a number of common interpretations of 'implementation' in the literature and how these have been confused. Various levels of implementation are then discussed, including the concept of model effectiveness, which concerns both the operational effectiveness of the model, as shown by a tangible improvement in the system modelled, and its personal effectiveness in helping users learn more about the system itself. Research is then described that explores various aspects of model usage and effectiveness. The research took place in the context of a business game used by a large number of practising managers; an unusual feature of the game was that participants were offered the use of a number of models to aid their decisions. Model usage and its effects were measured objectively. Results showed that the models were used extensively and that their effects were advantageous in a number of ways. The implications of this work are discussed.

15.
The paper analyzes the impact of the initial condition on the problem of testing for unit roots. To this end, we derive a family of optimal tests that maximize a weighted average power criterion with respect to the initial condition. We then investigate the relationship of this optimal family to popular tests. We find that many unit root tests are closely related to specific members of the optimal family, but the corresponding members employ very different weightings for the initial condition. The popular Dickey-Fuller tests, for instance, put a large weight on extreme deviations of the initial observation from the deterministic component, whereas other popular tests put more weight on moderate deviations. Since the power of unit root tests varies dramatically with the initial condition, this paper explains the results of comparative power studies of unit root tests. The results allow a much deeper understanding of the merits of particular tests in specific circumstances, and a guide to choosing which statistics to use in practice.
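The sensitivity of unit root tests to the initial condition is easy to see by simulation. The sketch below estimates the power of the augmented Dickey-Fuller test (via statsmodels) for a stationary AR(1) started at different initial deviations from its mean; the design is a simplified stand-in for the paper's analysis.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)

def rejection_rate(rho=0.95, xi0=0.0, n=100, reps=500, alpha=0.05):
    """Power of the ADF test for a stationary AR(1) started at initial
    condition xi0 (measured in units of the innovation standard deviation)."""
    rejections = 0
    for _ in range(reps):
        x = np.empty(n)
        x[0] = xi0
        for t in range(1, n):
            x[t] = rho * x[t - 1] + rng.normal()
        if adfuller(x, regression="c")[1] < alpha:   # index 1 is the p-value
            rejections += 1
    return rejections / reps

# Power changes markedly with the initial deviation from the mean.
for xi0 in (0.0, 2.5, 5.0):
    print(f"initial condition {xi0:3.1f}: power = {rejection_rate(xi0=xi0):.2f}")
```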

16.
In this article, we discuss an outage-forecasting model that we have developed. This model uses very few input variables to estimate hurricane-induced outages prior to landfall with great predictive accuracy. We also show results for a series of simpler models that use only publicly available data and can still estimate outages with reasonable accuracy. The intended users of these models are emergency response planners within power utilities and related government agencies. We developed our models with the random forest method, using data from a power distribution system serving two states in the Gulf Coast region of the United States. We also show that estimates of system reliability based on wind speed alone are not sufficient for adequately capturing the reliability of system components, and we demonstrate that a multivariate approach can produce more accurate power outage predictions.
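A minimal sketch of the multivariate approach with scikit-learn's random forest on simulated inputs of the publicly-available kind the article mentions; the covariates and the outage-generating process are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 1000

# Hypothetical pre-landfall inputs per grid cell.
wind_speed = rng.uniform(20, 60, n)      # forecast gust speed, m/s
rainfall = rng.uniform(0, 300, n)        # forecast accumulation, mm
soil_moisture = rng.uniform(0, 1, n)
customers = rng.integers(100, 5000, n)   # customers served in the cell

X = np.column_stack([wind_speed, rainfall, soil_moisture, customers])
# Simulated outages depend on more than wind speed alone, per the article.
outages = (0.002 * customers * (wind_speed / 60) ** 2
           * (1 + soil_moisture) + rng.normal(0, 2, n)).clip(0)

X_tr, X_te, y_tr, y_te = train_test_split(X, outages, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("out-of-sample R^2:", round(rf.score(X_te, y_te), 3))
print("feature importances:", rf.feature_importances_.round(2))
```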

17.
This paper estimates the jump component of realized volatility based on C_TMPV theory and, on that basis, builds jump-augmented AHAR-RV-CJ and MIDAS-RV-CJ models to forecast the realized volatility of the Chinese stock market, evaluating and comparing the predictive accuracy of the various volatility models. The empirical results show that the jump component estimated via C_TMPV has a significant positive effect on daily, weekly, and monthly volatility forecasts; that the AHAR-RV-CJ and MIDAS-RV-CJ models achieve the highest in-sample and out-of-sample predictive accuracy across the different forecast horizons, particularly in their logarithmic forms; that the MIDAS family achieves higher out-of-sample accuracy than the HAR family at medium and long horizons; and that, at those horizons, the AHAR-RV-CJ and MIDAS-RV-CJ models also deliver better out-of-sample predictive ability than the low-frequency-data Jump-GARCH, SV-CJ, and SV-IJ models.
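For orientation, the sketch below fits the basic HAR-RV regression, in which next-day realized volatility is projected on daily, weekly (5-day), and monthly (22-day) averages, to simulated data. The CJ variants in the paper would add the separately estimated C_TMPV jump component as a further regressor; all data here are simulated.

```python
import numpy as np

rng = np.random.default_rng(9)
T = 1000

# Simulated daily realized volatility (hypothetical, roughly persistent).
rv = np.empty(T)
rv[0] = 1.0
for t in range(1, T):
    rv[t] = abs(0.9 * rv[t - 1] + 0.1 + rng.normal(0, 0.2))

def trailing_mean(x, w, t):
    return x[t - w + 1: t + 1].mean()

# HAR-RV design: daily, weekly, and monthly averages predict next-day RV.
rows, target = [], []
for t in range(22, T - 1):
    rows.append([1.0, rv[t], trailing_mean(rv, 5, t), trailing_mean(rv, 22, t)])
    target.append(rv[t + 1])
X, y = np.array(rows), np.array(target)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS fit
print("HAR-RV coefficients [const, daily, weekly, monthly]:", beta.round(3))
```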

18.
Identity leadership theorizing suggests that leadership effectiveness derives from a potential leader's perceived ability to create, embody, promote, and embed a shared group identity. However, little is known about how people integrate this information to form a judgment of a leader. We use cognitive modeling to operationalize leadership judgments as exemplar- and prototype-based categorization processes. Analysis of attribute rating data for 80 highly recognizable Americans revealed that leadership judgments were well characterized by an exemplar-based model. Judgments were based overwhelmingly on promoting shared collective interests and embedding group identity. The pattern of attribute weightings was consistent for judgments of a general leadership role (i.e., as a competent leader) as well as judgments for a specific leadership role (i.e., as an effective US president). We discuss the implications of these findings for our understanding of identity leadership as well as for integrated social-cognitive models of individuals' judgments of and responses to leaders.
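A minimal sketch of an exemplar-based categorization model in the spirit of the generalized context model: a probe person is compared with stored exemplars via weighted attribute similarity, with the weights loaded on promoting and embedding as the article reports. All ratings and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical attribute ratings (create, embody, promote, embed identity)
# for 10 stored exemplars of known figures, plus leader/non-leader labels.
exemplars = rng.uniform(1, 7, size=(10, 4))
labels = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])    # 1 = judged a leader

# Attribute weights: "promote" and "embed" dominate, per the reported finding.
w = np.array([0.1, 0.1, 0.4, 0.4])

def exemplar_judgment(probe, sensitivity=1.0):
    """GCM-style judgment: similarity-weighted evidence across exemplars."""
    dist = (w * np.abs(exemplars - probe)).sum(axis=1)   # weighted city-block
    sim = np.exp(-sensitivity * dist)                     # exponential decay
    return sim[labels == 1].sum() / sim.sum()             # P("leader")

probe = np.array([4.0, 4.0, 6.5, 6.0])   # strong on promoting and embedding
print("P(judged a leader):", round(float(exemplar_judgment(probe)), 3))
```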

19.
20.
This paper describes a structured methodology for decomposing the conceptual design problem in order to facilitate the design process and produce improved conceptual designs that better satisfy the original customer requirements. The axiomatic decomposition method for conceptual design combines Alexander's network-partitioning formulation of the design problem with Suh's Independence Axiom. It uses a cross-domain approach in a House of Quality context to estimate the interactions among the functional requirements derived from a qualitative assessment of customer requirements. These interactions feed several objective functions that serve as criteria for decomposing the design network. A new network-partitioning algorithm is effective in creating partitions that maximize within-partition interactions and minimize between-partition interactions with appropriate weightings. The viability, usability, and value of the axiomatic decomposition method were examined through analytic comparisons and qualitative assessments of its application. The method was first examined with students in engineering design capstone courses; it was found to be usable and did produce better product designs that met the customer requirements, although the student-based assessment revealed that the process would be more effective with individuals having design experience. In a subsequent assessment with practicing industrial designers, the new method was again found to facilitate the development of better designs. An important observation was the need for limits on partition size (a maximum of four functional requirements). Another issue identified for future research is the need for a means of identifying the appropriate starting partition for initiating the design.
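The partitioning objective can be sketched with a crude random-restart search over partitions capped at four functional requirements, maximizing within-partition interaction (equivalently, minimizing between-partition interaction for a fixed total). The interaction matrix is invented, and the paper's algorithm is more sophisticated than this illustration.

```python
import itertools
import random
random.seed(6)

N_FR = 8   # functional requirements derived from customer requirements
# Hypothetical symmetric interaction strengths between functional requirements.
interaction = [[0] * N_FR for _ in range(N_FR)]
for i, j in itertools.combinations(range(N_FR), 2):
    interaction[i][j] = interaction[j][i] = random.randint(0, 5)

def within_score(partition):
    """Total interaction captured inside partitions (to be maximized)."""
    return sum(interaction[i][j]
               for block in partition
               for i, j in itertools.combinations(sorted(block), 2))

def random_partition(max_size=4):
    # Honor the paper's observed limit of four FRs per partition.
    items = list(range(N_FR))
    random.shuffle(items)
    return [set(items[k:k + max_size]) for k in range(0, N_FR, max_size)]

# Crude random-restart search over feasible partitions.
best = max((random_partition() for _ in range(5000)), key=within_score)
print("best partition:", [sorted(b) for b in best],
      " within-partition interaction:", within_score(best))
```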
