1.
The nonparametric two-sample bootstrap is applied to computing uncertainties of measures in receiver operating characteristic (ROC) analysis on large datasets in areas such as biometrics and speaker recognition when the analytical method cannot be used. Its validation was studied by computing the standard error of the area under the ROC curve with the well-established analytical Mann–Whitney statistic method and with the bootstrap. The analytical result is unique, whereas the bootstrap results form a probability distribution owing to the stochastic nature of resampling. The comparisons were carried out using relative errors and hypothesis testing, and the two approaches agree closely. This validation provides a sound foundation for such computations.
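As an illustration of the comparison described in this abstract, the sketch below computes the standard error of the AUC both from the Hanley–McNeil form of the Mann–Whitney-based analytical variance and from a nonparametric two-sample bootstrap. The scores, sample sizes, and number of replications are hypothetical, and the analytical formula is one common approximation rather than necessarily the exact expression used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def auc(pos, neg):
    """Mann-Whitney estimate of the area under the ROC curve."""
    diff = pos[:, None] - neg[None, :]
    return np.mean(diff > 0) + 0.5 * np.mean(diff == 0)

def analytical_se(pos, neg):
    """Hanley-McNeil approximation to the standard error of the AUC."""
    a = auc(pos, neg)
    m, n = len(pos), len(neg)
    q1 = a / (2 - a)
    q2 = 2 * a**2 / (1 + a)
    var = (a * (1 - a) + (m - 1) * (q1 - a**2) + (n - 1) * (q2 - a**2)) / (m * n)
    return np.sqrt(var)

def bootstrap_se(pos, neg, b=2000):
    """Nonparametric two-sample bootstrap SE: resample each class independently."""
    aucs = [auc(rng.choice(pos, len(pos), replace=True),
                rng.choice(neg, len(neg), replace=True)) for _ in range(b)]
    return np.std(aucs, ddof=1)

# Hypothetical genuine (positive) and impostor (negative) similarity scores.
pos = rng.normal(1.0, 1.0, 500)
neg = rng.normal(0.0, 1.0, 800)

print("AUC           :", auc(pos, neg))
print("analytical SE :", analytical_se(pos, neg))
print("bootstrap SE  :", bootstrap_se(pos, neg))
```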
2.
For the most part, strategy, entrepreneurship, and innovation have been researched and practiced from a representational position. In this paper, we make a case for taking a performative turn. Strategists, entrepreneurs, and intrapreneurs are embedded-embodied actors who engage in material-discursive practices in their attempts at constituting phenomena. Overflows, which are inevitable given the dispersion of agency, give rise to matters of concern for multiple stakeholder groups. Settlements between stakeholders are temporary, as phenomena will be de-constituted when constitutive arrangements change. Consequently, the projects and initiatives that strategists, entrepreneurs, and intrapreneurs undertake are best seen as ongoing journeys.
3.
The traditional method for estimating or predicting linear combinations of the fixed effects and realized values of the random effects in mixed linear models is first to estimate the variance components and then to proceed as if these estimates were the true values. This two-stage procedure gives unbiased estimators or predictors of the linear combinations provided the data vector is symmetrically distributed about its expected value and the variance component estimators are translation-invariant, even functions of the data vector. The standard procedures for estimating variance components yield even, translation-invariant estimators.
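A minimal numerical sketch of the two-stage idea for the simplest case, a balanced one-way random-effects model y_ij = μ + b_i + e_ij: the variance components are first estimated by even, translation-invariant ANOVA estimators, and the estimates are then plugged in as if they were the true values to predict the realized group effects. The model and all numbers are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated balanced one-way random-effects data: y_ij = mu + b_i + e_ij
k, n = 20, 8                       # groups, observations per group
mu, sig_b, sig_e = 5.0, 1.5, 2.0   # true mean and standard deviations
b = rng.normal(0, sig_b, k)
y = mu + b[:, None] + rng.normal(0, sig_e, (k, n))

# Stage 1: ANOVA (method-of-moments) estimates of the variance components.
# These are even, translation-invariant functions of the data.
group_means = y.mean(axis=1)
grand_mean = y.mean()
msb = n * np.sum((group_means - grand_mean) ** 2) / (k - 1)    # between-group MS
mse = np.sum((y - group_means[:, None]) ** 2) / (k * (n - 1))  # within-group MS
sig2_e_hat = mse
sig2_b_hat = max((msb - mse) / n, 0.0)

# Stage 2: plug the estimates in as if they were the true values and
# predict the realized group-level values mu + b_i (empirical BLUP).
shrink = sig2_b_hat / (sig2_b_hat + sig2_e_hat / n)
eblup = grand_mean + shrink * (group_means - grand_mean)

print("estimated variance components:", sig2_b_hat, sig2_e_hat)
print("first three predictions of mu + b_i:", eblup[:3])
```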
4.
For a knowledge- and skill-centric organization, the process of knowledge management encompasses three important and closely related elements: (i) task assignment, (ii) knowledge acquisition through training, and (iii) maintenance of a proper level of knowledge inventory among the existing workforce. The trade-off between short-term profit maximization and long-term agility and flexibility is a vexing problem in knowledge management. In this study, we examine the effects of different training strategies on short-term operational efficiency and long-term workforce flexibility. We address our research objective by developing a computational model for task and training assignment in a dynamic knowledge environment consisting of multiple distinct knowledge dimensions. Overall, we find that organizational slack is an important variable in determining the effectiveness of training strategies. Training strategies focused on the most recent skills are found to be the preferred option in most of the considered scenarios. Interestingly, increased efficiency in training can actually create a preference conflict between employees and the firm. Our findings indicate that firms facing longer knowledge life cycles, higher slack in workforce capacity, and better training efficiencies actually face more difficult challenges in knowledge management.
5.
Given a most believed value for a quantity, together with upper and lower possible deviations from that value, a rectangular distribution might be used to represent the state of knowledge about the quantity. If the deviations are themselves known only through probability distributions, and the value conditioned on the deviations is rectangular, then the marginal distribution of the value is determined by the distributions of the deviations. Here we show, under quite general conditions, that conversely, given the marginal distribution, the distributions of the deviations are uniquely determined. The case in which the marginal distribution is trapezoidal is studied in some detail.
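A Monte Carlo sketch of the forward direction described in this abstract: the deviations are drawn from assumed distributions (here uniform, purely for illustration), the value is then drawn from the rectangular distribution conditioned on those deviations, and the histogram approximates the resulting marginal.

```python
import numpy as np

rng = np.random.default_rng(2)

x0 = 10.0            # most believed value
n = 200_000          # Monte Carlo sample size

# Illustrative assumption: the lower and upper deviations are themselves
# uncertain and described by uniform distributions.
d_lower = rng.uniform(0.5, 1.5, n)
d_upper = rng.uniform(0.5, 1.5, n)

# Conditional on the deviations, the value is rectangular on [x0 - d_lower, x0 + d_upper].
x = rng.uniform(x0 - d_lower, x0 + d_upper)

# The histogram approximates the marginal distribution of the value,
# which is determined by the distributions chosen for the deviations.
hist, edges = np.histogram(x, bins=60, density=True)
centres = 0.5 * (edges[:-1] + edges[1:])
for c, h in zip(centres[::10], hist[::10]):
    print(f"x = {c:6.2f}   density ~ {h:.3f}")
```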
6.
In practical estimation problems, asymmetric loss functions are often more appropriate than squared error loss. We consider here the problem of fixed-precision point estimation of a linear parametric function of the regression coefficients in the multiple linear regression model using asymmetric loss functions. Because of the presence of nuisance parameters, the sample size for the estimation problem is not known beforehand, and hence we resort to adaptive multistage sampling methodologies. We discuss several multistage sampling techniques and compare their performance using simulation runs. The procedures are implemented in MATLAB 7.0.1 on a Pentium IV machine. Finally, we illustrate the usefulness of such asymmetric loss functions with a few practical examples.
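As a concrete example of an asymmetric loss, the LINEX loss L(Δ) = b(exp(aΔ) − aΔ − 1) penalizes over- and under-estimation of equal magnitude differently; the short sketch below evaluates that asymmetry for illustrative values of a and b, which are not taken from the paper.

```python
import numpy as np

def linex_loss(delta, a=1.0, b=1.0):
    """LINEX loss b*(exp(a*delta) - a*delta - 1), where delta = estimate - true value.
    For a > 0, overestimation is penalized more heavily than underestimation."""
    return b * (np.exp(a * delta) - a * delta - 1.0)

# Illustrative comparison: errors of equal magnitude but opposite sign.
for err in (0.5, 1.0, 2.0):
    print(f"|error| = {err}: over = {linex_loss(err):7.3f}, "
          f"under = {linex_loss(-err):7.3f}")
```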
7.
ROC analysis involving two large datasets is an important method for analyzing statistics of interest for decision making with a classifier in many disciplines. Data dependency caused by the repeated use of the same subjects to generate more samples under limited resources is ubiquitous. Hence, a two-layer data structure is constructed and the nonparametric two-sample two-layer bootstrap is employed to estimate standard errors of statistics of interest derived from two sets of data, such as a weighted sum of two probabilities. In this article, to reduce the bootstrap variance and ensure the accuracy of computation, Monte Carlo studies of bootstrap variability were carried out to determine the appropriate number of bootstrap replications in ROC analysis with data dependency. It is suggested that, with a tolerance of 0.02 for the coefficient of variation, 2,000 bootstrap replications are appropriate under such circumstances.
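A sketch of the kind of bootstrap-variability study described in this abstract, applied to a generic statistic (the sample mean) rather than the paper's two-layer ROC setting: the whole bootstrap is repeated many times at each candidate number of replications, and the coefficient of variation of the resulting standard-error estimates is compared with a tolerance such as 0.02.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(0, 1, 500)          # hypothetical sample; the statistic is its mean

def bootstrap_se(x, n_boot):
    """Standard error of the sample mean from one bootstrap run of n_boot replications."""
    idx = rng.integers(0, len(x), size=(n_boot, len(x)))
    return np.std(x[idx].mean(axis=1), ddof=1)

tolerance = 0.02                       # target coefficient of variation
for n_boot in (200, 500, 1000, 2000):
    # Repeat the whole bootstrap to measure how variable its SE estimate is.
    ses = np.array([bootstrap_se(data, n_boot) for _ in range(100)])
    cv = ses.std(ddof=1) / ses.mean()
    flag = "ok" if cv <= tolerance else "too variable"
    print(f"replications = {n_boot:5d}   CV of bootstrap SE = {cv:.4f}   {flag}")
```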
8.
Consider a normal population with unknown mean μ and unknown variance σ². We estimate μ under an asymmetric LINEX loss function such that the associated risk is bounded from above by a known quantity w. This necessitates the use of a random number N of observations. Under a fairly broad set of assumptions on N, we derive the asymptotic second-order expansion of the associated risk function. Examples involving accelerated sequential and three-stage sampling techniques are included, and the performance of these procedures is compared using a Monte Carlo study.
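A simplified two-stage (Stein-type) sketch of bounded-risk estimation of a normal mean under LINEX loss, using the fact that the sample mean's LINEX risk is b(exp(a²σ²/(2n)) − 1); the paper's accelerated sequential and three-stage procedures are more refined, and all parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

def two_stage_sample_size(pilot, a, w, b=1.0):
    """Stein-type two-stage rule (illustrative): bounding the sample mean's LINEX risk
    b*(exp(a^2*sigma^2/(2n)) - 1) by w requires n >= a^2*sigma^2 / (2*log(1 + w/b));
    the unknown sigma^2 is replaced by the pilot-sample variance."""
    m = len(pilot)
    s2 = np.var(pilot, ddof=1)
    n_req = int(np.ceil(a**2 * s2 / (2.0 * np.log1p(w / b))))
    return max(m, n_req)

# Hypothetical population and procedure parameters (not taken from the paper).
mu, sigma = 2.0, 3.0
a, w, m = 1.0, 0.05, 15            # LINEX shape, risk bound, pilot size

pilot = rng.normal(mu, sigma, m)
N = two_stage_sample_size(pilot, a, w)
extra = rng.normal(mu, sigma, N - m) if N > m else np.empty(0)
estimate = np.concatenate([pilot, extra]).mean()

print("pilot variance     :", np.var(pilot, ddof=1))
print("final sample size N:", N)
print("estimate of mu     :", estimate)
```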
9.
10.
Materials formulation and processing are important industrial activities, and most materials know-how comes from physical experiments. Our impression, based on discussions with materials scientists, is that statistically planned experiments are infrequently used in materials research. This scientific and engineering area provides an excellent opportunity both for using the available techniques of statistically planned experiments, including mixture experiments, and for identifying opportunities for collaborative research leading to further advances in statistical methods for scientists and engineers. This paper describes an application of Scheffé's (1958) simplex approach for mixture experiments to the formulation of high-temperature superconducting compounds. This example has given us a better appreciation of the needs of materials scientists and has provided us with opportunities for further collaborative research.
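For reference, the sketch below generates the points of a Scheffé {q, m} simplex-lattice design, the classical layout for mixture experiments in which each component proportion is a multiple of 1/m and the proportions sum to one; the three components are placeholders, not the superconductor constituents studied in the paper.

```python
from itertools import product
from fractions import Fraction

def simplex_lattice(q, m):
    """All mixtures of q components whose proportions are multiples of 1/m and sum to 1
    (the points of a Scheffe {q, m} simplex-lattice design)."""
    points = []
    for counts in product(range(m + 1), repeat=q):
        if sum(counts) == m:
            points.append(tuple(Fraction(c, m) for c in counts))
    return points

# Example: a {3, 2} lattice for three placeholder components A, B, C (6 design points).
for point in simplex_lattice(3, 2):
    print(", ".join(f"{name} = {float(p):.2f}" for name, p in zip("ABC", point)))
```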