This paper presents perspectives on quantitative techniques in academia and practice. Based on the findings of an empirical study, academicians and practitioners emphasize different techniques and prefer different journals for keeping abreast of the field. This reveals areas for curriculum improvement that would orient programs toward practitioners.
This commentary evaluates the usefulness of the Freed and Glover [6] linear programming approach to the discriminant problem and relates linear programming to other parametric and nonparametric approaches.
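The core of the Freed–Glover idea, minimising the total deviation of observations that fall on the wrong side of a cutoff, can be sketched in one dimension. This is a hedged illustration with invented data: the original formulation is a full linear program over attribute weights, whereas here the weight is fixed and the cutoff is found by brute force.

```python
# Sketch of the minimize-sum-of-deviations (MSD) discriminant idea in 1-D:
# choose a cutoff c so group-A scores fall at or below c and group-B scores
# above it, minimizing the total deviation of points on the wrong side.
# (Illustrative only; the full approach is an LP over attribute weights.)

def msd_cutoff(group_a, group_b, candidates=None):
    """Brute-force the 1-D MSD objective over candidate cutoffs."""
    if candidates is None:
        candidates = sorted(group_a + group_b)  # observed scores as candidates
    best_c, best_dev = None, float("inf")
    for c in candidates:
        dev = sum(max(0.0, x - c) for x in group_a)   # A-points above cutoff
        dev += sum(max(0.0, c - x) for x in group_b)  # B-points below cutoff
        if dev < best_dev:
            best_c, best_dev = c, dev
    return best_c, best_dev

# Perfectly separable toy groups: the deviation drops to zero.
cutoff, deviation = msd_cutoff([1.0, 2.0, 3.0], [4.0, 5.0, 6.0])
```

In the separable case above, any cutoff between the groups achieves zero deviation; with overlapping groups, the same objective trades off the two kinds of misclassification, which is what the linear programming formulation optimises exactly.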
This paper presents a new linear model methodology for clustering judges with homogeneous decision policies and differentiating dimensions which distinguish judgment policies. This linear policy capturing model based on canonical correlation analysis is compared to the standard model based on regression analysis and hierarchical agglomerative clustering. Potential advantages of the new methodology include simultaneous instead of sequential consideration of information in the dependent and independent variable sets, decreased interpretational difficulty in the presence of multicollinearity and/or suppressor/moderator variables, and a more clearly defined solution structure allowing assessment of a judge's relationship to all of the derived, ideal policy types. An application to capturing policies of information systems recruiters responsible for hiring entry-level personnel is used to compare and contrast the two techniques.
Exploring organisations is the prerequisite for any intentional attempt at strategic change. Yet, what is it that we observe when we observe organisations? The argument chooses a narrative approach to exploring organisations. With Niklas Luhmann we look at the operations of organising that make the organisation an organisation. The paper suggests the organisational collage (I.) of stories as a starting point of the exploration. The specific focus is on meaning-creation and sense-making as the genuine act of organisational self-observation. The disciplinary matrix (II.) reflects on how stories and narratives crystallise and rule the organisation in a paradigmatic way. Along Thomas Kuhn's understanding of paradigms (III.), management is referenced as an activity of a community of practice based on a disciplinary matrix of models, methods and instruments. Giorgio Agamben's conceptualisation of paradigms as reference-giving examples allows the opening up of the implicit side of organisational culture. Memetics (IV.), approaching reference-giving examples as memes and culture as a meme-complex, enables the observation of dynamics and cultural evolution over time. Concluding, we come to understand the organisational implications (V.) of the conservative nature of organisational development and the systemic sensitivity that allows for management, learning and change. And as always, advances in research come at the price of new questions.
Reducing data latency and guaranteeing quality of service are essential for modern computer networks. The emerging networking protocol, Multipath Transmission Control Protocol, can reduce data latency by transmitting data through multiple minimal paths (MPs) and ensure data integrity through its packet retransmission mechanism. The bandwidth of each edge can be considered multi-state because computer networks experience different situations, such as failures, partial failures and maintenance. We evaluate network reliability for a multi-state retransmission flow network through which the data can be successfully transmitted by means of multiple MPs under a time constraint. By generating all minimal bandwidth patterns, the proposed algorithm calculates network reliability subject to these requirements. An example and a practical case of the Pan-European Research and Education Network are used to demonstrate the proposed algorithm.
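The reliability notion used here can be illustrated by brute force rather than by the paper's minimal-bandwidth-pattern algorithm: enumerate every edge-state vector, credit each MP with the minimum capacity of its edges, and check whether the data can be sent within the time limit. The network, capacities, and probabilities below are invented for illustration.

```python
# Hedged sketch (not the paper's algorithm): brute-force multi-state network
# reliability. Each edge has several (capacity, probability) states; a state
# vector succeeds if data D fits through the edge-disjoint MPs within time T.
from itertools import product

edges = {  # edge -> list of (capacity, probability) states; made-up numbers
    "e1": [(0, 0.1), (2, 0.9)],
    "e2": [(0, 0.1), (2, 0.9)],
    "e3": [(0, 0.2), (3, 0.8)],
}
mps = [["e1", "e2"], ["e3"]]  # edge-disjoint minimal paths

def reliability(edges, mps, data, time_limit):
    names = list(edges)
    total = 0.0
    for combo in product(*(edges[n] for n in names)):
        cap = dict(zip(names, (c for c, _ in combo)))
        prob = 1.0
        for _, p in combo:
            prob *= p                      # states assumed independent
        # An MP's bandwidth is the minimum capacity along its edges.
        bandwidth = sum(min(cap[e] for e in path) for path in mps)
        if bandwidth > 0 and data / bandwidth <= time_limit:
            total += prob                  # this state vector succeeds
    return total

r = reliability(edges, mps, data=10, time_limit=5)
```

Brute force is exponential in the number of edges, which is exactly why the paper searches for minimal bandwidth patterns instead; the toy version only pins down what "reliability under the time constraint" means.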
Model averaging is preferred to estimating the benchmark dose (BMD) from a single model for dichotomous dose–response estimation, but challenges remain in implementing these methods for general analyses before model averaging is feasible in many risk assessment applications, and there is little work on Bayesian methods that include informative prior information for both the models and the parameters of the constituent models. This article introduces a novel approach that addresses many of these challenges while providing a fully Bayesian framework. Furthermore, in contrast to methods that use Markov chain Monte Carlo, we approximate the posterior density using maximum a posteriori (MAP) estimation. The approximation allows for an accurate and reproducible estimate while maintaining the speed of maximum likelihood, which is crucial in many applications such as processing massive high-throughput data sets. We assess this method by applying it to empirical laboratory dose–response data and measuring the coverage of confidence limits for the BMD. We compare the coverage of this method to that of other approaches using the same set of models. Through the simulation study, the method is shown to be markedly superior to the traditional approach of selecting a single preferred model (e.g., from the U.S. EPA BMD software) for the analysis of dichotomous data and comparable or superior to the other approaches.
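The two ingredients the abstract combines, MAP estimation with an informative prior and model averaging, can each be shown in a toy form. This is a hedged sketch, not the article's method: a binomial response with a Beta prior (where the posterior mode is closed-form, so no MCMC sampling is needed), followed by a weighted combination of hypothetical per-model BMDs.

```python
# Toy MAP estimate for a binomial response rate with a Beta(a, b) prior.
# The posterior is Beta(a+k, b+n-k); its mode is closed-form, so an
# informative prior shifts the estimate without any sampling.

def map_binomial(k, n, a, b):
    """Posterior mode; requires a + k > 1 and b + n - k > 1."""
    return (a + k - 1) / (a + b + n - 2)

mle = 3 / 10                                 # maximum likelihood: k/n
map_est = map_binomial(3, 10, 2.0, 2.0)      # informative Beta(2, 2) prior

# Model averaging: combine per-model BMD estimates with posterior model
# weights. The model names, BMDs, and weights below are hypothetical.
bmds = {"logistic": 1.8, "probit": 2.1}
weights = {"logistic": 0.6, "probit": 0.4}
bmd_ma = sum(weights[m] * bmds[m] for m in bmds)
```

With a flat Beta(1, 1) prior the posterior mode reduces to the maximum likelihood estimate, which is why the abstract can claim MAP keeps "the speed of maximum likelihood" while still admitting informative priors.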
Suppose that a residential neighborhood may have been contaminated by a nearby abandoned hazardous waste site. The suspected contamination consists of elevated soil concentrations of chemicals that are also found in the absence of site-related contamination. How should a risk manager decide which residential properties to sample and which ones to clean? This paper introduces an adaptive spatial sampling approach which uses initial observations to guide subsequent search. Unlike some recent model-based spatial data analysis methods, it does not require any specific statistical model for the spatial distribution of hazards, but instead constructs an increasingly accurate nonparametric approximation to it as sampling proceeds. Possible cost-effective sampling and cleanup decision rules are described by decision parameters such as the number of randomly selected locations used to initialize the process, the number of highest-concentration locations searched around, the number of samples taken at each location, a stopping rule, and a remediation action threshold. These decision parameters are optimized by simulating the performance of each decision rule. The simulation is performed using the data collected so far to impute multiple probable values of unknown soil concentration distributions during each simulation run. This optimized adaptive spatial sampling technique has been applied to real data using error probabilities for wrongly cleaning or wrongly failing to clean each location (compared to the action that would be taken if perfect information were available) as evaluation criteria. It provides a practical approach for quantifying trade-offs between these different types of errors and expected cost. It also identifies strategies that are undominated with respect to all of these criteria.
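A stripped-down version of such an adaptive rule can make the decision parameters concrete: seed with a few random samples, repeatedly sample unobserved neighbours of the highest observed concentrations until a budget is spent, then clean every sampled location above a remediation threshold. The 1-D grid, concentration field, and parameter values are invented for illustration and are not from the paper.

```python
# Hedged sketch of an adaptive spatial sampling decision rule: random
# initialization, search around the hottest observed cells, a sampling
# budget as the stopping rule, and a remediation action threshold.
import random

def adaptive_sample(field, n_init, n_top, budget, threshold, rng):
    size = len(field)
    observed = {}
    # Initialization: a handful of randomly chosen locations.
    for _ in range(n_init):
        i = rng.randrange(size)
        observed[i] = field[i]
    # Adaptive phase: sample unobserved neighbours of the hottest cells.
    while len(observed) < min(budget, size):
        hot = sorted(observed, key=observed.get, reverse=True)[:n_top]
        frontier = [j for i in hot for j in (i - 1, i + 1)
                    if 0 <= j < size and j not in observed]
        if not frontier:  # all neighbours seen: fall back to any unseen cell
            frontier = [j for j in range(size) if j not in observed]
        j = rng.choice(frontier)
        observed[j] = field[j]
    clean = {i for i, c in observed.items() if c > threshold}
    return observed, clean

rng = random.Random(0)
field = [1, 2, 9, 8, 2, 1, 1, 7, 1, 1]      # true soil concentrations
obs, clean = adaptive_sample(field, n_init=3, n_top=2,
                             budget=6, threshold=5, rng=rng)
# Errors versus the perfect-information action (clean iff concentration > 5):
false_neg = sum(1 for i, c in enumerate(field) if c > 5 and i not in clean)
false_pos = len([i for i in clean if field[i] <= 5])
```

Tuning `n_init`, `n_top`, `budget`, and `threshold` by simulating many such runs, with unknown concentrations imputed from the data collected so far, is the optimization step the abstract describes; the error counts above correspond to its wrongly-failing-to-clean and wrongly-cleaning criteria.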