Similar Articles
20 similar articles found.
1.
The purpose of this article is to strengthen the understanding of the relationship between a fixed-blocks and a random-blocks analysis in models that do not include interactions between treatments and blocks. Treating the block effects as random has been recommended in the literature for balanced incomplete block designs (BIBD) because it results in smaller variances of treatment contrasts. This reduction in variance is large if the block-to-block variation relative to the total variation is small. However, this analysis is also more complicated because it leads to a subjective interpretation of the results if the block variance component is non-positive. The probability of a non-positive variance component is large precisely in those situations where a random-blocks analysis is useful – that is, when the block-to-block variation, relative to the total variation, is small. In contrast, the analysis in which the block effects are fixed is computationally simpler and less subjective. For some BIBDs, the loss in power from a fixed-effects analysis is trivial, and in such cases we recommend treating the block effects as fixed. For response surface experiments designed in blocks, however, the opposite recommendation is made. When block effects are fixed, the variance of the estimated response surface is not uniquely estimated, and in practice this variance is obtained by ignoring the block effect. It is argued that treating the block effects as random is more reasonable than ignoring them.
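As a rough illustration of the two analyses contrasted above (not the authors' own computations), the sketch below fits the same incomplete-block data once with fixed block effects and once with random block effects; the design, effect sizes, and variance components are hypothetical.

```python
# Minimal sketch: fixed-blocks (OLS) vs random-blocks (mixed model) analysis
# of an incomplete block design on simulated data. The design, effect sizes,
# and variance components are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
# A small balanced incomplete block design: 4 treatments, 6 blocks of size 2.
blocks = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
treat_eff = np.array([0.0, 0.5, 1.0, 1.5])            # hypothetical treatment effects
block_eff = rng.normal(0, 0.3, size=len(blocks))      # small block-to-block variation
rows = []
for b, pair in enumerate(blocks):
    for t in pair:
        y = treat_eff[t] + block_eff[b] + rng.normal(0, 1.0)
        rows.append({"y": y, "treatment": str(t), "block": str(b)})
df = pd.DataFrame(rows)

# Fixed-blocks analysis: blocks enter as fixed-effect dummies.
fixed_fit = smf.ols("y ~ C(treatment) + C(block)", data=df).fit()

# Random-blocks analysis: blocks enter as a random intercept.
random_fit = smf.mixedlm("y ~ C(treatment)", data=df, groups=df["block"]).fit()

print(fixed_fit.params.filter(like="treatment"))
print(random_fit.params.filter(like="treatment"))
```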

2.
The traditional Cramér–von Mises criterion is used to develop a test of the equality of the underlying lifetime distributions in the presence of independent censoring times. Its asymptotic distribution is established, and a resampling plan that is valid for unbalanced data situations is proposed. Its statistical power is studied and compared with that of commonly used linear rank tests by Monte Carlo simulation, and a real data analysis is also considered. The new test is clearly more powerful than the traditional ones when there is no uniform dominance among the involved distributions and in the presence of late differences. Its statistical power is also good in the other scenarios considered.
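The classical (uncensored) two-sample Cramér–von Mises criterion that the proposed test builds on can be sketched with scipy as follows; the handling of censoring and the resampling plan described above are not reproduced, and the simulated samples are hypothetical.

```python
# Minimal sketch of the classical (uncensored) two-sample Cramér-von Mises
# criterion that the proposed test generalizes; the paper's treatment of
# independent censoring and its resampling plan are not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two hypothetical lifetime samples with a "late difference": same early
# behaviour, heavier upper tail in the second group.
x = rng.weibull(1.5, size=80)
y = rng.weibull(1.5, size=90)
y[y > 1.0] += rng.exponential(0.8, size=(y > 1.0).sum())   # inflate the upper tail only

res = stats.cramervonmises_2samp(x, y)
print(f"CvM statistic = {res.statistic:.3f}, p-value = {res.pvalue:.3f}")
```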

3.
This paper empirically examines the effect of inflation expectations on corporate investment behavior and its underlying mechanism, in order to analyze the economic consequences of changes in inflation expectations. We find that a rise in the expected macro-level inflation rate prompts firms to increase current capital expenditure but lowers investment efficiency. Further analysis shows that, first, the higher the expected inflation rate, the more significantly high-growth firms raise their investment levels and the more significantly their investment efficiency declines, whereas these results are not significant for low-growth firms; second, when the expected macro-level inflation rate rises, banks are more willing to lend to firms, and this result is again significant for high-growth firms but not for low-growth firms. This indicates that inflation expectations stimulate corporate investment through external financing: high-growth firms can obtain more bank loans than low-growth firms and thereby increase capital investment, while the effect of inflation expectations on low-growth firms is weakened.
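A schematic sketch of the kind of regression such a study might run (capital expenditure on expected inflation interacted with a growth indicator) is given below; the simulated data, variable names, and specification are assumptions, not the paper's actual model or sample.

```python
# Schematic sketch of the kind of regression described above: firm capital
# expenditure regressed on expected inflation and its interaction with a
# high-growth indicator. Data and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
df = pd.DataFrame({
    "expected_inflation": rng.normal(0.03, 0.01, n),
    "high_growth": rng.integers(0, 2, n),
    "size": rng.normal(10, 1, n),
})
# Hypothetical data-generating process: investment responds to expected
# inflation more strongly for high-growth firms.
df["capex"] = (0.1 + 2.0 * df["expected_inflation"]
               + 3.0 * df["expected_inflation"] * df["high_growth"]
               + 0.01 * df["size"] + rng.normal(0, 0.05, n))

fit = smf.ols("capex ~ expected_inflation * high_growth + size", data=df).fit(cov_type="HC1")
print(fit.summary().tables[1])
```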

4.
In this paper, an optimization model is developed for the economic design of a rectifying inspection sampling plan in the presence of two markets. A product with a normally distributed quality characteristic with unknown mean and variance is produced in the process. The quality characteristic has a lower specification limit. The aim of this paper is to maximize the profit, which incorporates the Taguchi loss function, under the constraint of satisfying the producer's and consumer's risks in two different markets simultaneously. The giveaway cost per unit of sold excess material is considered in the proposed model. A case study is presented to illustrate the application of the proposed methodology. In addition, a sensitivity analysis is performed to study the effect of the model parameters on the expected profit and the optimal solution. The optimal process adjustment problem and the acceptance sampling plan are combined in the economic optimization model. Also, the process mean and standard deviation are assumed to be unknown, and their impact is analyzed. Finally, inspection error is considered, and its impact is investigated and analyzed.
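The full economic model (Taguchi loss, giveaway cost, inspection error) is not reproduced here. As a much-simplified, hedged sketch of the constraint side only, the grid search below picks a known-sigma variables sampling plan (n, k) for a lower specification limit that satisfies hypothetical producer's- and consumer's-risk points in two markets, using the smallest sample size as a crude cost proxy.

```python
# Much-simplified sketch of the constraint side of such a design: choose a
# known-sigma variables sampling plan (n, k) that meets producer's and
# consumer's risk points in two markets, by grid search. The paper's profit
# objective with Taguchi loss and giveaway cost is not reproduced; all risk
# points below are hypothetical.
import numpy as np
from scipy.stats import norm

def p_accept(p, n, k):
    """OC curve of a known-sigma variables plan: accept if (xbar - LSL)/sigma >= k."""
    return norm.cdf((norm.ppf(1.0 - p) - k) * np.sqrt(n))

# Hypothetical (AQL, min acceptance prob) and (LTPD, max acceptance prob) per market.
constraints = [
    (0.01, 0.95, 0.06, 0.10),   # market 1
    (0.02, 0.90, 0.08, 0.10),   # market 2
]

best = None
for n in range(5, 200):
    for k in np.arange(0.5, 3.5, 0.01):
        ok = all(p_accept(p1, n, k) >= pa1 and p_accept(p2, n, k) <= pa2
                 for p1, pa1, p2, pa2 in constraints)
        if ok:                       # smallest feasible n acts as a crude cost proxy
            best = (n, round(float(k), 2))
            break
    if best:
        break

print("smallest feasible plan (n, k):", best)
```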

5.
In this paper, two tests based on a weighted CUSUM of the least squares residuals are studied to detect, in real time, a change-point in a nonlinear model. The first test statistic is proposed by extending a method already used in the literature for linear models. At each sequential observation, the null hypothesis that there is no change in the model is tested against the presence of a change. The asymptotic distribution of the test statistic under the null hypothesis is given, and its convergence in probability to infinity is proved when a change occurs. These results allow an asymptotic critical region to be built. Next, in order to decrease the type I error probability, a bootstrapped critical value is proposed and a modified test is studied in a similar way. A generalization of the Hájek–Rényi inequality is established.
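A schematic sketch of the general idea, monitoring a CUSUM of residuals from a nonlinear least-squares fit against a crude boundary, is shown below; the paper's weighting scheme, boundary, and bootstrapped critical values are not reproduced, and the model, data, and threshold are hypothetical.

```python
# Schematic sketch of sequential change-point monitoring with a CUSUM of
# least-squares residuals from a nonlinear model. The weighting scheme,
# boundary function, and bootstrapped critical values of the paper are not
# reproduced; the model, data, and alarm threshold below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    return a * np.exp(b * x)           # hypothetical nonlinear regression function

rng = np.random.default_rng(3)
# Historical (training) sample, assumed change-free.
x_hist = np.linspace(0, 1, 100)
y_hist = model(x_hist, 2.0, 1.0) + rng.normal(0, 0.2, x_hist.size)
theta, _ = curve_fit(model, x_hist, y_hist, p0=(1.0, 0.5))
sigma = np.std(y_hist - model(x_hist, *theta), ddof=2)

# Monitoring phase: the parameters shift halfway through the new observations.
x_new = rng.uniform(0, 1, 100)
y_new = model(x_new, 2.0, 1.0) + rng.normal(0, 0.2, x_new.size)
y_new[50:] += 1.0                       # level shift after the change-point

cusum = 0.0
threshold = 4.0 * sigma * np.sqrt(np.arange(1, x_new.size + 1))   # crude boundary
for t, (xi, yi) in enumerate(zip(x_new, y_new), start=1):
    cusum += yi - model(xi, *theta)     # cumulative sum of monitoring residuals
    if abs(cusum) > threshold[t - 1]:
        print(f"alarm raised at monitoring observation {t}")
        break
```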

6.
Inference for a scalar interest parameter in the presence of nuisance parameters is considered in terms of the conditional maximum-likelihood estimator developed by Cox and Reid (1987). Parameter orthogonality is assumed throughout. The estimator is analyzed by means of stochastic asymptotic expansions in three cases: a scalar nuisance parameter, m nuisance parameters from m independent samples, and a vector nuisance parameter. In each case, the expansion for the conditional maximum-likelihood estimator is compared with that for the usual maximum-likelihood estimator, and the means and variances are also compared. In each of the cases, the bias of the conditional maximum-likelihood estimator is unaffected by the nuisance parameter to first order; this is not so for the maximum-likelihood estimator. The assumption of parameter orthogonality is crucial in attaining this result. Regardless of parametrization, the difference between the two estimators is of first order and is deterministic to this order.
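A small simulation, not taken from the paper, can illustrate the "m nuisance parameters from m independent samples" case: the ordinary maximum-likelihood estimator of a common variance is badly biased by the many nuisance means, while a REML-type estimator, standing in here for the conditional maximum-likelihood estimator, is approximately unbiased.

```python
# Small simulation (not from the paper) for the "m independent samples" case:
# estimating a common variance when each sample has its own nuisance mean.
# The ordinary MLE is biased by the nuisance parameters, while the REML-type
# estimator, which plays the role of a conditional MLE here, is not.
import numpy as np

rng = np.random.default_rng(7)
m, n_per, sigma2 = 200, 2, 1.0          # many small samples, true variance 1
reps = 2000
mle, cond = [], []
for _ in range(reps):
    means = rng.normal(0, 5, size=(m, 1))                       # nuisance means
    x = rng.normal(means, np.sqrt(sigma2), size=(m, n_per))
    ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum()
    mle.append(ss / (m * n_per))          # ordinary MLE: divides by m*n
    cond.append(ss / (m * (n_per - 1)))   # REML-type estimator: divides by m*(n-1)
print("ordinary MLE mean:", np.mean(mle))   # around 0.5: badly biased
print("REML-type mean:   ", np.mean(cond))  # around 1.0: approximately unbiased
```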

7.
A generalization of the group screening technique for finding the non-negligible factors in a first-order model is presented. In conventional screening designs, each factor is assigned to a single factor group, and the sum of the effects associated with each group is estimated in the first stage. The new designs assign a factor to multiple groups in the first stage. An individual effect is estimated in the second stage only if every group to which the corresponding factor is assigned is "active" in the first stage. The performance of these designs is compared with that of conventional group screening designs for cases in which the direction of each factor's effect, if active, is assumed known a priori and measurement error is negligible.
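A toy simulation of the multi-group idea (not the authors' actual designs or performance comparison) is sketched below; effect directions are assumed known and measurement error negligible, as in the setting described above.

```python
# Toy simulation of the multi-group screening idea described above (not the
# authors' actual designs): each factor is assigned to several groups, and a
# factor goes to stage two only if every group containing it is "active".
# Effect directions are assumed known and measurement error negligible.
import numpy as np

rng = np.random.default_rng(11)
n_factors, n_groups, groups_per_factor = 30, 10, 2
true_effects = np.zeros(n_factors)
active = rng.choice(n_factors, size=3, replace=False)
true_effects[active] = rng.uniform(1.0, 2.0, size=3)   # non-negligible factors

# Random assignment of each factor to `groups_per_factor` distinct groups.
assignment = np.array([rng.choice(n_groups, size=groups_per_factor, replace=False)
                       for _ in range(n_factors)])

# Stage 1: with known directions and no error, a group's total is the sum of
# the effects of the factors assigned to it; a group is "active" if positive.
group_totals = np.zeros(n_groups)
for f in range(n_factors):
    group_totals[assignment[f]] += true_effects[f]
group_active = group_totals > 0

# Stage 2: carry a factor forward only if all of its groups were active.
carried = [f for f in range(n_factors) if group_active[assignment[f]].all()]
print("truly active factors:   ", sorted(active.tolist()))
print("factors sent to stage 2:", carried)
```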

8.
Using time-series data on FDI and inbound business tourism (IBT) in Beijing, Shanghai, and Guangzhou from 1995 to 2013, the relationship between the two is analyzed with cointegration tests, Granger causality tests, and elasticity analysis. The results show a stable long-run cointegration relationship between the two. IBT Granger-causes FDI, but whether FDI Granger-causes IBT differs across the three cities: in Beijing, neither the number of foreign-funded enterprises nor their investment Granger-causes IBT; in Shanghai, foreign-enterprise investment Granger-causes IBT while the number of foreign enterprises does not; in Guangzhou, the number of foreign enterprises Granger-causes IBT while their investment does not. The elasticity analysis shows that the elasticity of foreign-enterprise investment with respect to IBT is higher than the elasticity of the number of foreign enterprises with respect to IBT, indicating that IBT stimulates foreign-enterprise investment more than it stimulates the number of foreign enterprises. A comparison of the three cities shows that the interaction between the two is significant in Shanghai and Guangzhou but not in Beijing, although the stimulating effect of IBT on FDI is largest in Beijing; this is related both to each city's positioning in attracting FDI and to how developed its IBT is.
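The two workhorse tests used in the study can be sketched with statsmodels on simulated stand-ins for the FDI and IBT series; the actual 1995–2013 city-level data are not reproduced here.

```python
# Schematic sketch of the cointegration and Granger-causality tests used in
# the study, applied to simulated stand-ins for the FDI and IBT series.
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(5)
n = 200
trend = np.cumsum(rng.normal(0, 1, n))               # shared stochastic trend
ibt = trend + rng.normal(0, 0.5, n)                  # stand-in for inbound business tourism
lagged_ibt = np.concatenate(([ibt[0]], ibt[:-1]))
fdi = 0.8 * lagged_ibt + rng.normal(0, 0.5, n)       # FDI follows IBT with a lag

# Engle-Granger cointegration test between the two level series.
t_stat, p_value, _ = coint(fdi, ibt)
print(f"cointegration test: t = {t_stat:.2f}, p = {p_value:.3f}")

# Granger causality: does IBT help predict FDI? (first column = effect, second = cause)
res = grangercausalitytests(np.column_stack([fdi, ibt]), maxlag=2)
for lag, (tests, _) in res.items():
    print(f"lag {lag}: ssr F-test p-value = {tests['ssr_ftest'][1]:.4f}")
```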

9.
The Cox proportional hazards model is widely used in time-to-event analysis. Two time scales are used in practice: time-on-study and chronological age. The former is the most frequently used time scale in clinical studies and longitudinal observational studies. However, there is no general consensus about which time scale is best. It has been asserted that if the cumulative baseline hazard is exponential or if the age at entry is independent of the covariate, then the two models are equivalent. We show that neither of these conditions leads to equivalence. Variability in the age at entry of individuals in the study causes the models to differ significantly. This is shown both analytically and through a simulation study. Additionally, we show that the time-on-study model is more robust to changes in age at entry than the chronological-age model.
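A small simulated comparison of the two time scales (not the paper's analysis) is sketched below using lifelines; it assumes that CoxPHFitter accepts an entry_col argument for delayed entry on the age scale, and the data-generating process, in which entry age depends on the covariate, is hypothetical.

```python
# Small simulated comparison (not the paper's analysis) of the two time
# scales: time-on-study vs chronological age with delayed entry. This sketch
# assumes lifelines' CoxPHFitter accepts an `entry_col` argument for late
# entry; entry ages are made to depend on the covariate so the fits differ.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2024)
n = 2000
x = rng.integers(0, 2, n)                              # binary covariate
age_entry = 50 + 10 * x + rng.uniform(0, 5, n)         # entry age depends on covariate
followup = rng.exponential(scale=10 / np.exp(0.5 * x)) # true log hazard ratio = 0.5
censor = rng.uniform(0, 15, n)
time_on_study = np.minimum(followup, censor)
event = (followup <= censor).astype(int)

df = pd.DataFrame({"x": x, "event": event,
                   "t_study": time_on_study,
                   "age_entry": age_entry,
                   "age_exit": age_entry + time_on_study})

# Time-on-study scale.
fit_study = CoxPHFitter().fit(df[["x", "event", "t_study"]],
                              duration_col="t_study", event_col="event")
# Chronological-age scale with delayed entry (left truncation).
fit_age = CoxPHFitter().fit(df[["x", "event", "age_entry", "age_exit"]],
                            duration_col="age_exit", event_col="event",
                            entry_col="age_entry")
print("time-on-study log-HR:", fit_study.params_["x"])
print("age-scale log-HR:    ", fit_age.params_["x"])
```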

10.
The purpose of this study is to investigate, using the Bland–Altman method, the agreement between item difficulty coefficients calculated under classical test theory and under item response theory. According to the results, although there is a high correlation between Pj and the b coefficients estimated with the HGLM (hierarchical generalized linear model), 1P, and 3P models, there is no agreement between the two approaches and they cannot be used interchangeably. The Bland–Altman graphics show wide limits of agreement; therefore, there is no agreement between the item difficulty values obtained from the two methods. The Bland–Altman method, which is used mostly in clinical studies, is suggested for comparing methods used in the evaluation of student performance in education and for agreement studies among expert judgements, especially as a source of additional information for studies in which only a correlation coefficient is calculated.
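A minimal sketch of the Bland–Altman computation itself (bias and 95% limits of agreement) is given below for hypothetical CTT and IRT difficulty values; it is not the study's data or full analysis.

```python
# Minimal sketch of the Bland-Altman agreement computation (bias and limits
# of agreement) for two sets of item difficulty values; the CTT p-values and
# IRT b-parameters below are hypothetical, not the study's data.
import numpy as np

rng = np.random.default_rng(8)
n_items = 40
ctt_difficulty = rng.uniform(0.2, 0.9, n_items)                            # classical P_j values
irt_difficulty = 1.5 - 2.0 * ctt_difficulty + rng.normal(0, 0.4, n_items)  # IRT b values

# Put both on a common scale first; the IRT b runs opposite to P_j, so flip its sign.
a = (ctt_difficulty - ctt_difficulty.mean()) / ctt_difficulty.std(ddof=1)
b = -(irt_difficulty - irt_difficulty.mean()) / irt_difficulty.std(ddof=1)

diff = a - b                       # per-item differences
avg = (a + b) / 2                  # x-axis of the Bland-Altman plot
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
print(f"correlation: {np.corrcoef(a, b)[0, 1]:.2f}")
print(f"bias: {bias:.3f}, 95% limits of agreement: ({loa[0]:.3f}, {loa[1]:.3f})")
```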

11.
Asymmetry is a feature of shape which is of particular interest in a variety of applications. With landmark data, the essential information on asymmetry is contained in the degree to which there is a mismatch between a landmark configuration and its relabelled and matched reflection. This idea is explored in the context of a study of facial shape in infants, where particular interest lies in identifying changes over time and in assessing residual deformity in children who have had corrective surgery for a cleft lip or cleft lip and palate. Interest lies not in whether the mean shape is asymmetric but in comparing the degrees of asymmetry in different populations. A decomposition of the asymmetry score into components that are attributable to particular features of the face is proposed. A further decomposition allows different sources of asymmetry due to position, orientation or intrinsic asymmetry to be identified for each feature. The methods are also extended to data representing anatomical curves across the face.
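The basic mismatch score described above can be sketched as a Procrustes comparison between a configuration and its relabelled reflection; the landmark coordinates and left/right pairing below are hypothetical, and the feature-level decompositions of the paper are not reproduced.

```python
# Minimal sketch of the basic asymmetry score described above (not the
# authors' decomposition): reflect a landmark configuration, relabel the
# left/right-paired landmarks, and measure the Procrustes mismatch with the
# original. The 2-D landmark configuration and pairing below are hypothetical.
import numpy as np
from scipy.spatial import procrustes

# Hypothetical facial landmarks (x, y): two midline points and two left/right pairs.
landmarks = np.array([
    [0.00, 1.00],    # 0: midline point
    [0.02, -1.00],   # 1: midline point, slightly off-centre
    [-0.90, 0.30],   # 2: left feature
    [1.00, 0.35],    # 3: right feature (its pair), not a perfect mirror image
    [-0.50, -0.40],  # 4: left feature
    [0.50, -0.30],   # 5: right feature (its pair)
])
relabel = [0, 1, 3, 2, 5, 4]                   # swap each left landmark with its right pair

reflected = landmarks * np.array([-1.0, 1.0])  # reflect about the vertical axis
matched = reflected[relabel]                   # relabel so pairs line up again

# Procrustes disparity between the configuration and its relabelled reflection:
# zero for a perfectly symmetric configuration, larger for greater asymmetry.
_, _, asymmetry_score = procrustes(landmarks, matched)
print(f"asymmetry score: {asymmetry_score:.4f}")
```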

12.
This paper proposes a new method for estimating the parameters of Lorenz curves (LCs) and fitting LCs to observed data. The method is very general: it is applicable to any family of LCs as long as it is given in closed form, which is often the case in practice. The method can also be applied either to the LC or to its associated distribution. The estimators are easy to compute, as they are obtained one at a time by solving only one equation in one unknown, and in many cases the solutions are given in closed form. An additional advantage, not shared by the currently used method of estimation, is that the method is invariant to which variable is written as a function of the other in the LC form. The method is applied to the most commonly suggested LC families. An example with real-life data is used to illustrate the methodology. A simulation study is performed to study the properties of the proposed estimators and to compare them with existing ones. The results seem to indicate that the proposed estimators have good properties and that they often perform much better than the existing ones.
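As a hedged illustration of the "one equation in one unknown" flavour of such estimators (not the paper's actual method), the sketch below fits the hypothetical one-parameter Lorenz curve family L(p) = p^a by matching the sample Gini coefficient.

```python
# Hedged illustration of the "one equation in one unknown" idea, applied to a
# hypothetical one-parameter Lorenz curve family L(p) = p**a; this is not the
# paper's estimator, only the flavour of the approach.
import numpy as np

rng = np.random.default_rng(6)
income = rng.pareto(3.0, size=5000) + 1.0          # hypothetical income data

# Sample Gini coefficient from the order statistics.
x = np.sort(income)
n = x.size
gini = 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

# For L(p) = p**a the Gini equals (a - 1) / (a + 1), so matching the sample
# Gini is a single equation in `a`, solved here in closed form.
a_hat = (1.0 + gini) / (1.0 - gini)
print(f"sample Gini: {gini:.3f}, fitted exponent a: {a_hat:.3f}")
for q in (0.2, 0.5, 0.8):
    print(f"fitted Lorenz ordinate L({q}) = {q ** a_hat:.3f}")
```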

13.
The problem of interpreting lung-function measurements in industrial workers is examined. Two common lung-function measurements (FEV1 and FVC) are described. The standard method currently used in the analysis of such cross-sectional survey data is discussed. The basic assumption of a linear decline with age is questioned on the basis of large sets of data from a variety of industries in British Columbia. It is shown that, while the linear assumption holds approximately in unexposed, healthy, nonsmoking individuals, a quadratic age effect is often observed in smokers and/or in individuals who are industrially exposed to certain fumes or dusts. Recognizing this accelerated rate of deterioration in the lungs is of fundamental importance both to the identification of affected individuals and to the understanding of the process involved. An attempt is made to interpret the variety of nonlinear situations observed by appealing to population selection mechanisms, individual variations in susceptibility, and the effects due to various levels of stimulus strength.
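The linear-versus-quadratic age question can be sketched as a comparison of two nested regressions; the simulated data below merely stand in for FEV1 in an exposed group and are not the British Columbia survey data.

```python
# Schematic sketch of the linear-vs-quadratic age question discussed above,
# on simulated data standing in for FEV1 in an exposed group; not the survey
# data, and the coefficients are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(9)
n = 400
age = rng.uniform(20, 65, n)
height = rng.normal(175, 7, n)
# Hypothetical accelerated (quadratic) decline in lung function with age.
fev1 = 0.05 * height - 0.01 * age - 0.0006 * (age - 20) ** 2 + rng.normal(0, 0.4, n)
df = pd.DataFrame({"fev1": fev1, "age": age, "height": height})

linear = smf.ols("fev1 ~ age + height", data=df).fit()
quadratic = smf.ols("fev1 ~ age + I(age**2) + height", data=df).fit()

# Nested-model F test: does the quadratic age term improve the fit?
print(anova_lm(linear, quadratic))
print("AIC linear vs quadratic:", round(linear.aic, 1), round(quadratic.aic, 1))
```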

14.
Based on spatio-temporal data for 31 provinces from 1980 to 2005, this paper analyzes the relationship between China's cement consumption and economic development and builds regional demand models. The results show that over the 25 years China's cement consumption grew in step with population and per capita GDP; total cement consumption is a Cobb-Douglas function of total population, per capita GDP, and fixed-asset investment, and cement consumption has a double-logarithmic relationship with total fixed-asset investment. An analysis of cross-sectional data for 1996, 2000, and 2005 shows that cement consumption in the 31 provinces is driven by two factors, total population and per capita GDP (or per capita fixed-asset investment), in Cobb-Douglas form. Fitted results for the three periods are given, and the associated elasticities are analyzed, providing a scientific basis for planning China's cement production and its regional layout.
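A schematic fit of the Cobb-Douglas (log-log) demand form described above is sketched below on simulated provincial observations; the actual panel and the elasticities used in the simulation are hypothetical.

```python
# Schematic sketch of the Cobb-Douglas (log-log) demand model described
# above, fitted to simulated provincial data; the 1980-2005 panel and the
# elasticities used here are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(10)
n = 31 * 26                                             # 31 provinces x 26 years, schematically
pop = rng.lognormal(mean=8.0, sigma=0.6, size=n)        # population
gdp_pc = rng.lognormal(mean=9.0, sigma=0.8, size=n)     # per-capita GDP
# Hypothetical Cobb-Douglas data-generating process: C = A * pop^0.9 * gdp_pc^0.6.
cement = np.exp(-3.0) * pop ** 0.9 * gdp_pc ** 0.6 * np.exp(rng.normal(0, 0.2, n))

df = pd.DataFrame({"cement": cement, "pop": pop, "gdp_pc": gdp_pc})
fit = smf.ols("np.log(cement) ~ np.log(pop) + np.log(gdp_pc)", data=df).fit()
print(fit.params)        # the slope estimates are the two elasticities
```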

15.
Surveillance to detect changes of spatial patterns is of interest in many areas such as environmental control and regional analysis. Here the interaction parameter of the Ising model is considered. A minimal sufficient statistic and its asymptotic distribution are used. It is demonstrated that the convergence to the normal distribution is rapid. The main result is that when the lattice is large, all the approximations are better in several respects. It is shown that, for large lattice sizes, earlier results on surveillance of a normally distributed random variable can be used in cases of most interest. The expected delay of an alarm at a fixed level of false alarm probability is examined for some examples.
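The statistic being monitored can be sketched directly: for the Ising model, the sufficient statistic for the interaction parameter is the sum of nearest-neighbour spin products. The crude normal-approximation alarm rule below is only illustrative; the paper's surveillance procedure and delay calculations are not reproduced.

```python
# Minimal sketch of the quantity monitored above: the sufficient statistic
# for the Ising interaction parameter is the sum of nearest-neighbour spin
# products on the lattice. A crude normal-approximation alarm rule is shown;
# the paper's surveillance procedure is not reproduced.
import numpy as np

def neighbour_sum(lattice):
    """Sum of products of horizontally and vertically adjacent spins (+/-1)."""
    return (np.sum(lattice[:, :-1] * lattice[:, 1:])
            + np.sum(lattice[:-1, :] * lattice[1:, :]))

rng = np.random.default_rng(12)
size = 40

# In-control reference: independent +/-1 spins (interaction parameter zero).
ref = np.array([neighbour_sum(rng.choice([-1, 1], size=(size, size)))
                for _ in range(500)])
mean0, sd0 = ref.mean(), ref.std(ddof=1)

# New observation with positive spatial association, built by smoothing a
# noise field and thresholding it (a stand-in for a changed spatial pattern).
noise = rng.normal(size=(size + 2, size + 2))
smooth = (noise[1:-1, 1:-1] + noise[:-2, 1:-1] + noise[2:, 1:-1]
          + noise[1:-1, :-2] + noise[1:-1, 2:]) / 5.0
new_lattice = np.where(smooth > 0, 1, -1)

z = (neighbour_sum(new_lattice) - mean0) / sd0
print(f"standardized statistic: {z:.2f}", "-> alarm" if z > 3 else "-> no alarm")
```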

16.
It is well known that heterogeneity between studies in a meta-analysis can be either caused by diversity, for example, variations in populations and interventions, or caused by bias, that is, variations in design quality and conduct of the studies. Heterogeneity that is due to bias is difficult to deal with. On the other hand, heterogeneity that is due to diversity is taken into account by a standard random-effects model. However, such a model generally assumes that heterogeneity does not vary according to study-level variables such as the size of the studies in the meta-analysis and the type of study design used. This paper develops models that allow for this type of variation in heterogeneity and discusses the properties of the resulting methods. The models are fitted using the maximum-likelihood method and by modifying the Paule–Mandel method. Furthermore, a real-world argument is given to support the assumption that the inter-study variance is inversely proportional to study size. Under this assumption, the corresponding random-effects method is shown to be connected with standard fixed-effect meta-analysis in a way that may well appeal to many clinicians. The models and methods that are proposed are applied to data from two large systematic reviews.
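A small numerical sketch of the stated assumption, an inter-study variance inversely proportional to study size, is given below under the extra simplification of a common within-study variance parameter; with tau_i^2 = phi / n_i the inverse-variance weights become proportional to n_i, so the random-effects point estimate coincides with the standard fixed-effect estimate. This is only an illustration of the connection mentioned above, not the paper's model fitting, and the study sizes and variance parameters are hypothetical.

```python
# Small numerical sketch (a simplification, not the paper's model fitting):
# with tau_i^2 = phi / n_i and within-study variances sigma^2 / n_i, the
# inverse-variance weights are proportional to n_i, so the random-effects
# pooled estimate coincides with the standard fixed-effect estimate.
import numpy as np

rng = np.random.default_rng(13)
n = np.array([50, 120, 300, 80, 1000])                  # hypothetical study sizes
effects = rng.normal(0.3, np.sqrt(1.0 / n))             # hypothetical study estimates
sigma2, phi = 1.0, 0.5                                  # assumed common sigma^2 and phi

within_var = sigma2 / n
tau2 = phi / n                                          # heterogeneity shrinks with size
w_random = 1.0 / (within_var + tau2)
w_fixed = 1.0 / within_var

pooled_random = np.sum(w_random * effects) / np.sum(w_random)
pooled_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
print(f"random-effects pooled estimate: {pooled_random:.4f}")
print(f"fixed-effect pooled estimate:   {pooled_fixed:.4f}")   # identical point estimates
```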

17.
The robustness of confidence intervals for a scale parameter based on M-estimators is studied, especially in small samples. The coverage probability is used as a measure of robustness. A theorem for a lower bound of the minimum coverage probability of M-estimators is presented and applied in order to examine the behavior of the standard deviation and the median absolute deviation as interval estimators. This bound can confirm the robustness of any other scale M-estimator in interval estimation. The idea of stretching is used to formulate the family of distributions considered as underlying. Critical values for the confidence interval are computed where needed, that is, for the median absolute deviation under the Normal, Uniform, and Cauchy distributions and for the standard deviation under the Uniform and Cauchy distributions. Simulation results are obtained for the estimation of the coverage probabilities and the critical values.
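A small Monte Carlo sketch in the same spirit (not the paper's bound, stretched family, or critical values) estimates the coverage of the usual chi-square confidence interval based on the sample standard deviation under a Normal model and under a heavier-tailed alternative.

```python
# Small Monte Carlo sketch (not the paper's lower-bound theorem or stretched
# family): coverage of the usual chi-square confidence interval for the scale,
# based on the sample standard deviation, under a Normal distribution and
# under a heavier-tailed t distribution rescaled to the same sigma.
import numpy as np
from scipy import stats

rng = np.random.default_rng(14)
n, reps, level = 10, 5000, 0.95
lo_q, hi_q = stats.chi2.ppf([(1 - level) / 2, (1 + level) / 2], df=n - 1)

def coverage(sampler, true_sigma):
    hits = 0
    for _ in range(reps):
        x = sampler(n)
        s2 = x.var(ddof=1)
        ci = ((n - 1) * s2 / hi_q, (n - 1) * s2 / lo_q)   # chi-square interval for sigma^2
        hits += ci[0] <= true_sigma ** 2 <= ci[1]
    return hits / reps

t_df = 4
t_scale = 1.0 / np.sqrt(t_df / (t_df - 2))      # rescale t_4 so its standard deviation is 1
print("coverage, Normal(0,1):", coverage(lambda m: rng.normal(0, 1, m), 1.0))
print("coverage, scaled t_4: ", coverage(lambda m: t_scale * rng.standard_t(t_df, m), 1.0))
```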

18.
We develop an alternative information-theoretic method of inference for problems in which all of the observed information is in terms of intervals. We focus on the unconditional case, in which the observed information is the minimal and maximal value in each period. Given interval data, we infer the joint and marginal distributions of the interval variable and its range. Our inferential procedure is based on entropy maximization subject to multidimensional moment conditions and normalization, in which the entropy is defined over discretized intervals. The discretization is based on theory or on empirically observed quantities. The number of estimated parameters is independent of the discretization, so the level of discretization does not change the fundamental level of complexity of our model. As an example, we apply our method to study the weather pattern for Los Angeles and New York City across the last century.  相似文献
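The basic building block, maximizing entropy over a discretized support subject to moment and normalization constraints, can be sketched as follows; the multidimensional joint formulation for interval (minimum/maximum) data and the weather application are not reproduced, and the support and target moment below are assumptions.

```python
# Minimal sketch of entropy maximization over a discretized support subject
# to a moment condition and normalization, the building block of the
# procedure described above; the joint formulation for interval data is not
# reproduced. The support grid and target mean are assumed values.
import numpy as np
from scipy.optimize import minimize

support = np.linspace(0.0, 10.0, 21)     # discretized support points (assumed)
target_mean = 3.0                        # observed moment to be matched (assumed)

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))

constraints = (
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},                    # normalization
    {"type": "eq", "fun": lambda p: np.sum(p * support) - target_mean},  # moment condition
)
p0 = np.full(support.size, 1.0 / support.size)
res = minimize(neg_entropy, p0, method="SLSQP", constraints=constraints,
               bounds=[(0.0, 1.0)] * support.size)

p_hat = res.x        # maximum-entropy probabilities (exponential-tilting form)
print("achieved mean:", round(float(np.sum(p_hat * support)), 3))
print("first five probabilities:", np.round(p_hat[:5], 3))
```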

19.
20.
Given two random samples of equal size from two normal distributions with a common mean but possibly different variances, we examine the sampling performance of the pre-test estimator for the common mean after a preliminary test for equality of variances. It is shown that when the alternative in the pre-test is one-sided, the Graybill–Deal estimator is dominated by the pre-test estimator if the critical value is chosen appropriately. It is also shown that all the estimators, the grand mean, the Graybill–Deal estimator and the pre-test estimator, are admissible when the alternative in the pre-test is two-sided. The optimal critical values in the two-sided pre-test are sought based on the minimax regret and the minimum average risk criteria, and it is shown that the Graybill–Deal estimator is most preferable under the minimum average risk criterion when the alternative in the pre-test is two-sided.
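A minimal sketch of the three estimators compared above is given below, with the pre-test step implemented as a two-sided F-test at a conventional level; the paper's risk comparisons and optimal critical values are not reproduced, and the data are simulated.

```python
# Minimal sketch (not the paper's risk or critical-value calculations) of the
# three estimators compared above: the grand mean, the Graybill-Deal weighted
# estimator, and a pre-test estimator that switches between them according to
# a preliminary two-sided F-test for equality of variances.
import numpy as np
from scipy import stats

def graybill_deal(x, y):
    s2x, s2y = x.var(ddof=1), y.var(ddof=1)
    wx, wy = len(x) / s2x, len(y) / s2y          # weights: n_i over sample variance
    return (wx * x.mean() + wy * y.mean()) / (wx + wy)

def pretest_estimator(x, y, alpha=0.05):
    n = len(x)                                    # equal sample sizes assumed
    f = x.var(ddof=1) / y.var(ddof=1)
    lo = stats.f.ppf(alpha / 2, n - 1, n - 1)
    hi = stats.f.ppf(1 - alpha / 2, n - 1, n - 1)
    if lo <= f <= hi:                             # variances look equal: use the grand mean
        return np.concatenate([x, y]).mean()
    return graybill_deal(x, y)                    # otherwise weight by precision

rng = np.random.default_rng(15)
x = rng.normal(5.0, 1.0, 20)
y = rng.normal(5.0, 3.0, 20)                      # common mean, unequal variances
print("grand mean:        ", np.concatenate([x, y]).mean())
print("Graybill-Deal:     ", graybill_deal(x, y))
print("pre-test estimator:", pretest_estimator(x, y))
```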
