Similar Articles (20 results)
1.
Fosdick and Raftery (2012) recently encountered the problem of inference for a bivariate normal correlation coefficient ρ with known variances. We derive a variance-stabilizing transformation y(ρ) analogous to Fisher's classical z-transformation for the unknown-variance case. Adjusting y for the sample size n produces an improved "confidence-stabilizing" transformation yn(ρ) that provides more accurate interval estimates for ρ than the known-variance MLE. Interestingly, the z-transformation applied to the unknown-but-equal-variance MLE performs well in the known-variance case for smaller values of |ρ|. Both methods are useful for comparing two or more correlation coefficients in the known-variance case.
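As a point of reference for the abstract above, Fisher's classical z-transformation interval for the unknown-variance case can be sketched as follows; this is the standard normal-theory construction (1.96 critical value, 1/(n − 3) variance), not the authors' yn(ρ) transformation:

```python
import math

def fisher_z_ci(r, n, crit=1.959963984540054):
    """Approximate 95% CI for rho from a sample correlation r based on
    n bivariate-normal pairs: z = atanh(r) is roughly
    N(atanh(rho), 1/(n - 3)), so invert the interval with tanh."""
    z = math.atanh(r)
    half_width = crit / math.sqrt(n - 3)
    return math.tanh(z - half_width), math.tanh(z + half_width)

lo, hi = fisher_z_ci(0.6, 50)  # interval for rho when r = 0.6, n = 50
```

The back-transformed interval is asymmetric around r, which is exactly the variance-stabilizing behavior the z-transformation is designed to provide.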

2.
A plot of each ranking of N objects in N-dimensional space is shown to provide geometric interpretations of Kendall's tau and Spearman's rho and also of the relationship of rho to a sum of inversion weights. The computation of rho from a sum of inversion weights is shown to allow sequential calculation of rho.
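The paper's inversion-weight scheme is not reproduced here; as a minimal sketch of the two coefficients it interprets, the textbook formulas can be coded directly for untied rankings:

```python
def spearman_rho(rank_x, rank_y):
    """Spearman's rho for two untied rankings of the same n objects:
    rho = 1 - 6 * sum(d_i**2) / (n * (n**2 - 1))."""
    n = len(rank_x)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_x, rank_y))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

def kendall_tau(rank_x, rank_y):
    """Kendall's tau: (concordant - discordant) pairs over n*(n-1)/2."""
    n = len(rank_x)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            prod = (rank_x[i] - rank_x[j]) * (rank_y[i] - rank_y[j])
            s += 1 if prod > 0 else -1
    return 2.0 * s / (n * (n - 1))
```

For example, the rankings (1, 2, 3, 4) and (1, 3, 2, 4) contain a single inversion, giving rho = 0.8 and tau = 2/3.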

3.
We propose two simple moment-based procedures, one with (GCCC1) and one without (GCCC2) normality assumptions, to generalize the inference of the concordance correlation coefficient for evaluating agreement among multiple observers for measurements on a continuous scale. A modified Fisher's Z-transformation was adapted to further improve the inference. We compared the proposed methods with a U-statistic-based inference approach. Simulation analysis showed desirable statistical properties of the simplified approach GCCC1, in terms of coverage probability and coverage balance, especially for small samples. GCCC2, which is distribution-free, behaved comparably to the U-statistic-based procedure but has a more intuitive and explicit variance estimator. The utility of these approaches is illustrated using two clinical data examples.

4.
A Gaussian copula is widely used to define correlated random variables. To obtain a prescribed Pearson correlation coefficient of ρx between two random variables with given marginal distributions, the correlation coefficient ρz between two standard normal variables in the copula must take a specific value which satisfies an integral equation that links ρx to ρz. In a few cases, this equation has an explicit solution, but in other cases it must be solved numerically. This paper attempts to address this issue. If two continuous random variables are involved, the marginal transformation is approximated by a weighted sum of Hermite polynomials; via Mehler’s formula, a polynomial of ρz is derived to approximate the function relationship between ρx and ρz. If a discrete variable is involved, the marginal transformation is decomposed into piecewise continuous ones, and ρx is expressed as a polynomial of ρz by Taylor expansion. For a given ρx, ρz can be efficiently determined by solving a polynomial equation.
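The paper's Hermite-polynomial and Taylor approximations are not reproduced here; a brute-force sketch of the same matching problem solves the integral equation by Monte Carlo evaluation plus bisection, exploiting the monotonicity of ρx in ρz. The Exponential(1) marginals are an illustrative assumption:

```python
import numpy as np
from scipy.special import ndtr  # standard normal CDF, vectorized

def induced_rho_x(rho_z, ppf1, ppf2, n=200_000, seed=0):
    """Monte Carlo estimate of the Pearson correlation rho_x produced by
    a Gaussian copula with normal correlation rho_z and the given
    marginal quantile functions ppf1, ppf2."""
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n)
    z2 = rho_z * z1 + np.sqrt(1.0 - rho_z ** 2) * rng.standard_normal(n)
    return np.corrcoef(ppf1(ndtr(z1)), ppf2(ndtr(z2)))[0, 1]

def solve_rho_z(target_rho_x, ppf1, ppf2, tol=1e-3):
    """rho_x is nondecreasing in rho_z, so plain bisection on the
    simulated curve (fixed seed keeps it deterministic) converges."""
    lo, hi = -0.999, 0.999
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if induced_rho_x(mid, ppf1, ppf2) < target_rho_x:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

exp_ppf = lambda u: -np.log1p(-u)  # Exponential(1) quantile function
rho_z = solve_rho_z(0.5, exp_ppf, exp_ppf)  # rho_z needed for rho_x = 0.5
```

Because monotone marginal transforms attenuate correlation, the solved ρz is slightly larger than the target ρx; the polynomial approximations in the paper replace this simulation loop with a single polynomial root-finding step.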

5.
Kappa and B assess agreement between two observers independently classifying N units into k categories. We study their behavior under zero cells in the contingency table and unbalanced asymmetric marginal distributions. Zero cells arise when a cross-classification is never endorsed by both observers; biased marginal distributions occur when some categories are preferred differently between the observers. Simulations studied the distributions of the unweighted and weighted statistics for k=4, under fixed proportions of diagonal agreement and different off-diagonal patterns, with various sample sizes, and under various zero cell count scenarios. Marginal distributions were first uniform and homogeneous, and then unbalanced asymmetric distributions. Results for the unweighted kappa and B statistics were comparable to the work of Muñoz and Bangdiwala, even with zero cells. A slight increase in variation was observed as the sample size decreased. Weighted statistics did show greater variation as the number of zero cells increased, with weighted kappa increasing substantially more than weighted B. Under biased marginal distributions, weighted kappa with Cicchetti weights was higher than with squared weights. Both statistics for observer agreement behaved well under zero cells. The weighted B was less variable than the weighted kappa under similar circumstances and different weights. In general, B's performance and graphical interpretation make it preferable to kappa under the studied scenarios.
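For reference, (weighted) kappa from a k × k contingency table can be sketched as follows; the squared-weight scheme is one of the weightings studied above, while Cicchetti weights and Bangdiwala's B statistic are not implemented here. Note that a zero cell poses no computational problem:

```python
import numpy as np

def cohens_kappa(table, weighted=False):
    """Kappa from a k x k contingency table of rater counts.

    weighted=False gives unweighted kappa; weighted=True uses the
    quadratic agreement weights w_ij = 1 - ((i - j)/(k - 1))**2.
    """
    p = np.asarray(table, dtype=float)
    p /= p.sum()                       # cell proportions
    k = p.shape[0]
    if weighted:
        i, j = np.indices((k, k))
        w = 1.0 - ((i - j) / (k - 1)) ** 2
    else:
        w = np.eye(k)
    po = (w * p).sum()                                       # observed agreement
    pe = (w * np.outer(p.sum(axis=1), p.sum(axis=0))).sum()  # chance agreement
    return (po - pe) / (1.0 - pe)
```

For example, the table [[20, 5], [10, 15]] gives unweighted kappa 0.4, and a table with a zero cell such as [[10, 0], [5, 5]] gives 0.5.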

6.
The most popular method for trying to detect an association between two random variables is to test H0: ρ = 0, the hypothesis that Pearson's correlation is equal to zero. It is well known, however, that Pearson's correlation is not robust, roughly meaning that small changes in any distribution, including any bivariate normal distribution as a special case, can alter its value. Moreover, the usual estimate of ρ, r, is sensitive to only a few outliers, which can mask a true association. A simple alternative to testing H0: ρ = 0 is to switch to a measure of association that guards against outliers among the marginal distributions, such as Kendall's tau, Spearman's rho, a Winsorized correlation, or a so-called percentage bend correlation. But it is known that these methods fail to take into account the overall structure of the data. Many measures of association that do take into account the overall structure of the data have been proposed, but it seems that nothing is known about how they might be used to detect dependence. One such measure of association is selected, which is designed so that under bivariate normality, its estimator gives a reasonably accurate estimate of ρ. Then methods for testing the hypothesis of a zero correlation are studied.
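A Winsorized correlation of the kind mentioned above can be sketched as follows; the 20% Winsorizing level is a common but arbitrary choice, and the percentage bend correlation is not shown:

```python
import numpy as np

def winsorized_corr(x, y, gamma=0.2):
    """Pearson correlation after Winsorizing each margin at the
    gamma and 1 - gamma empirical quantiles."""
    def winsorize(v):
        v = np.asarray(v, dtype=float)
        lo, hi = np.quantile(v, [gamma, 1.0 - gamma])
        return np.clip(v, lo, hi)
    return np.corrcoef(winsorize(x), winsorize(y))[0, 1]

# One gross outlier masks a perfect linear trend for Pearson's r
# but barely moves the Winsorized correlation.
x = np.arange(20.0)
y = x.copy()
y[-1] = -100.0
```

On this toy data the raw Pearson correlation is driven below zero by the single outlier, while the Winsorized correlation remains strongly positive, illustrating the masking effect described above.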

7.
One common method for analyzing data in experimental designs when observations are missing was devised by Yates (1933), who developed his procedure based upon a suggestion by R. A. Fisher. Considering a linear model with independent, equi-variate errors, Yates substituted algebraic values for the missing data and then minimized the error sum of squares with respect to both the unknown parameters and the algebraic values. Yates showed that this procedure yielded the correct error sum of squares and a positively biased hypothesis sum of squares.

Others have elaborated on this technique. Chakrabarti (1962) gave a formal proof of Fisher's rule that produced a way to simplify the calculations of the auxiliary values to be used in place of the missing observations. Kshirsagar (1971) proved that the hypothesis sum of squares based on these values was biased, and developed an easy way to compute that bias.

8.
The distribution of the sample correlation coefficient is derived when the population is a mixture of two bivariate normal distributions with zero mean but different covariances and mixing proportions 1 - λ and λ respectively; λ will be called the proportion of contamination. The test of ρ = 0 based on Student's t, Fisher's z, arcsine, or Ruben's transformation is shown numerically to be nonrobust when λ, the proportion of contamination, lies between 0.05 and 0.50 and the contaminated population has 9 times the variance of the standard (bivariate normal) population. These tests are also sensitive to the presence of outliers.

9.
Asymptotic distributions of the standardized estimators of the squared and non-squared multiple correlation coefficients under nonnormality were obtained using Edgeworth expansion up to O(1/n). Conditions for the normal-theory asymptotic biases and variances to hold under nonnormality were derived with respect to the parameter values and the weighted sum of the cumulants of the associated variables. The condition on the cumulants indicates a compensatory effect that yields robust normal-theory lower-order cumulants. Simulations were performed to assess the usefulness of the asymptotic expansion formulas using a model with asymptotic robustness under nonnormality; the approximations by Edgeworth expansions were satisfactory.

10.
In his Fisher Lecture, Efron (1998, "Fisher in the 21st Century" (with discussion), Statistical Science 13:95–122) pointed out that maximum likelihood estimates (MLE) can be badly biased in certain situations involving many nuisance parameters. He predicted that with modern computing equipment a computer-modified version of the MLE that was less biased could become the default estimator of choice in applied problems in the 21st century. This article discusses three modifications: Lindsay's conditional likelihood, integrated likelihood, and Bartlett's bias-corrected estimating function. Each is evaluated through a study of the bias and MSE of the estimates in a stratified Weibull model with a moderate number of nuisance parameters. In Lindsay's estimating equation, three different methods for estimating the nuisance parameters are evaluated: the restricted maximum likelihood estimate (RMLE), a Bayes estimator, and a linear Bayes estimator. In our model, the conditional likelihood with the RMLE of the nuisance parameters is equivalent to Bartlett's bias-corrected estimating function. In the simulation we show that Lindsay's conditional likelihood is in general preferred, irrespective of the estimator of the nuisance parameters. Although the integrated likelihood has smaller MSE when the precise nature of the prior distribution of the nuisance parameters is known, this approach may perform poorly when that prior distribution is not known, especially with a non-informative prior. In practice, Lindsay's method using the RMLE of the nuisance parameters is recommended.

11.
Three new weighted rank correlation coefficients are proposed which are sensitive to agreement on both the top and bottom rankings. The first is based on the weighted rank correlation coefficient proposed by Maturi and Abdelfattah [A new weighted rank correlation, J. Math. Stat. 4 (2008), pp. 226–230]; the second and third are based on the order statistics and the quantiles of the Laplace distribution, respectively. The limiting distributions of the new correlation coefficients under the null hypothesis of no association between the rankings are presented, and a summary of the exact and approximate quantiles for these coefficients is provided. A simulation study compares the performance of Kendall's tau, Spearman's rho, and the new weighted rank correlation coefficients in detecting agreement on the top and the bottom rankings simultaneously. Finally, examples are given for illustration, including a real data set of financial market indices.

12.
We deal with the asymptotic expansions of the means and variances of correlation coefficients in truncated bivariate normal populations. Fisher's z-transformation is generalized to stabilize the variance in a truncated normal population. Hermite moments are introduced, and the relationship among cross moments, central cross moments, and Hermite moments is discussed.

13.
The correlation coefficient (CC) is a standard measure of a possible linear association between two continuous random variables. The CC plays a significant role in many scientific disciplines. For a bivariate normal distribution, there are many types of confidence intervals for the CC, such as z-transformation and maximum likelihood-based intervals. However, when the underlying bivariate distribution is unknown, the construction of confidence intervals for the CC is not well-developed. In this paper, we discuss various interval estimation methods for the CC. We propose a generalized confidence interval for the CC when the underlying bivariate distribution is a normal distribution, and two empirical likelihood-based intervals for the CC when the underlying bivariate distribution is unknown. We also conduct extensive simulation studies to compare the new intervals with existing intervals in terms of coverage probability and interval length. Finally, two real examples are used to demonstrate the application of the proposed methods.

14.
Econometric Reviews, 2013, 32(1): 25–52
This paper argues that Fisher's paradox can be explained away in terms of estimator choice. We analyse by means of Monte Carlo experiments the small sample properties of a large set of estimators (including virtually all available single-equation estimators), and compute the critical values based on the empirical distributions of the t-statistics, for a variety of Data Generation Processes (DGPs), allowing for structural breaks, ARCH effects, etc. We show that precisely the estimators most commonly used in the literature, namely OLS, Dynamic OLS (DOLS), and non-prewhitened FMLS, have the worst performance in small samples and produce rejections of the Fisher hypothesis. If one employs the estimators with the most desirable properties (i.e., the smallest downward bias and the minimum shift in the distribution of the associated t-statistics), or if one uses the empirical critical values, the evidence based on US data is strongly supportive of the Fisher relation, consistent with many theoretical models.

15.
Under proper conditions, two independent tests of the null hypothesis of homogeneity of means are provided by a set of sample averages. One test, with tail probability P1, relates to the variation between the sample averages, while the other, with tail probability P2, relates to the concordance of the rankings of the sample averages with the anticipated rankings under an alternative hypothesis. The quantity G = P1P2 is considered as the combined test statistic and, except for the discreteness in the null distribution of P2, would correspond to Fisher's statistic for combining probabilities. For the case of four means, it is illustrated how to obtain critical values of G, or critical values of P1 for each possible value of P2, taking discreteness into account. Alternative measures of concordance considered are Spearman's ρ and Kendall's τ. In the case of two averages, the approach assigns two-thirds of the test size to the concordant tail and one-third to the discordant tail.
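Ignoring the discreteness of P2 discussed above, the null tail probability of G = P1P2 for two independent continuous p-values has a simple closed form, since -2 ln(P1P2) is chi-square with 4 degrees of freedom. A quick simulation confirms it:

```python
import math
import random

def g_tail_prob(g):
    """P(P1 * P2 <= g) for two independent Uniform(0,1) p-values.

    Equivalent to Fisher's combination for two continuous tests:
    -2 ln(P1 P2) ~ chi-square(4), which gives g * (1 - ln g)."""
    return g * (1.0 - math.log(g))

# Monte Carlo check of the closed form at g = 0.05.
random.seed(1)
n = 200_000
hits = sum(random.random() * random.random() <= 0.05 for _ in range(n))
mc_estimate = hits / n
```

Note that the critical value g yielding test size 0.05 is much smaller than 0.05 itself, since P(G ≤ 0.05) is roughly 0.2; the paper's contribution is to adjust this calculation for the discrete support of P2.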

16.
Recently, Domma et al. [An extension of Azzalini's method, J. Comput. Appl. Math. 278 (2015), pp. 37–47] proposed an extension of Azzalini's method, which is attractive for its flexibility and ease of application. Most weighted Weibull models introduced so far have a monotonic hazard rate function, which limits their applicability. Our aim is therefore to build a new weighted Weibull distribution with both monotonic and non-monotonic hazard rate functions. A new weighted Weibull distribution, the generalized weighted Weibull (GWW) distribution, is introduced using the method of Domma et al. The GWW distribution possesses decreasing, increasing, upside-down bathtub, N-shaped, and M-shaped hazard rates, and its statistical properties are easy to derive. Finally, we apply the GWW model to a real data set and provide a simulation study.

17.
In this paper, Duncan's cost model combined with Taguchi's quadratic loss function is applied to develop the economic-statistical design of the sum of squares exponentially weighted moving average (SS-EWMA) chart. A genetic algorithm is applied to search for the optimal decision variables of the SS-EWMA chart such that the expected cost is minimized. Sensitivity analysis reveals that the optimal sample size and sampling interval decrease, while the optimal smoothing constant and control limit increase, as the mean and/or variance increases. Moreover, the combination of optimal parameter levels in the orthogonal array experiment provides an important guideline for monitoring the process mean and/or variance.

18.
Exact powers of four classical tests in a GMANOVA model are compared numerically when the order of the error sum of squares matrix is 2. The four tests are the likelihood ratio (LR), Pillai's V, Hotelling's T2, and Roy's largest root tests. It turns out that for small sizes, there are a few cases in which Rothenberg's condition for the relative magnitude of the asymptotic powers of the three standard tests does not hold.

19.
We derive two C(α) statistics and the likelihood-ratio statistic for testing the equality of several correlation coefficients, from k ≥ 2 independent random samples from bivariate normal populations. The asymptotic relationship of the C(α) tests, the likelihood-ratio test, and a statistic based on the normality assumption of Fisher's Z-transform of the sample correlation coefficient is established. A comparative performance study, in terms of size and power, is then conducted by Monte Carlo simulations. The likelihood-ratio statistic is often too liberal, and the statistic based on Fisher's Z-transform is conservative. The performance of the two C(α) statistics is identical. They maintain significance level well and have almost the same power as the other statistics when empirically calculated critical values of the same size are used. The C(α) statistic based on a noniterative estimate of the common correlation coefficient (based on Fisher's Z-transform) is recommended.
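The classical Fisher's Z-transform homogeneity test and the noniterative common-correlation estimate mentioned above can be sketched as follows; this is the standard normal-theory construction, not the paper's C(α) statistics:

```python
import math

def equal_corr_test(rs, ns):
    """Chi-square test of H0: rho_1 = ... = rho_k from k independent
    sample correlations rs with sample sizes ns.

    Each z_i = atanh(r_i) is treated as approximately normal with
    variance 1/(n_i - 3); the weighted sum of squared deviations from
    the pooled z is approximately chi-square with k - 1 df under H0.
    Also returns the noniterative common-correlation estimate tanh(zbar).
    """
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3.0 for n in ns]
    zbar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)   # pooled z
    chi2 = sum(w * (z - zbar) ** 2 for w, z in zip(ws, zs))
    return chi2, len(rs) - 1, math.tanh(zbar)
```

With equal sample correlations the statistic is exactly zero, while clearly unequal correlations (e.g. 0.2 vs 0.8 at n = 50 each) produce a large chi-square value on 1 degree of freedom.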

20.
Cohen's kappa coefficient is traditionally used to quantify the degree of agreement between two raters on a nominal scale. Correlated kappas occur in many settings (e.g., repeated agreement by raters on the same individuals, concordance between diagnostic tests and a gold standard) and often need to be compared. While different techniques are now available to model correlated κ coefficients, they are generally not easy to implement in practice. The present paper describes a simple alternative method based on the bootstrap for comparing correlated kappa coefficients. The method is illustrated by examples and its type I error studied using simulations. The method is also compared with the generalized estimating equations of the second order and the weighted least-squares methods.
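A percentile-bootstrap comparison of two correlated kappas in the spirit of the method above can be sketched as follows; resampling whole subjects preserves the dependence between the two coefficients. The three-column layout (rater A, rater B, gold standard) and the toy data are illustrative assumptions:

```python
import random
from collections import Counter

def kappa(pairs):
    """Cohen's kappa from a list of (rating1, rating2) pairs."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n            # observed agreement
    c1 = Counter(a for a, _ in pairs)
    c2 = Counter(b for _, b in pairs)
    pe = sum(c1[c] * c2[c] for c in c1) / n ** 2      # chance agreement
    return (po - pe) / (1 - pe)

def bootstrap_kappa_diff(data, B=2000, seed=0):
    """Percentile-bootstrap CI for kappa(A, gold) - kappa(B, gold).

    data: list of (a, b, gold) triples; resampling subjects keeps the
    two kappa estimates correlated, as they are in the original data."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(B):
        boot = [data[rng.randrange(len(data))] for _ in data]
        diffs.append(kappa([(a, g) for a, _, g in boot])
                     - kappa([(b, g) for _, b, g in boot]))
    diffs.sort()
    return diffs[int(0.025 * B)], diffs[int(0.975 * B)]

# Toy data: rater A errs on 10% of subjects, rater B on one-third.
data = [(g if i % 10 else 1 - g, g if i % 3 else 1 - g, g)
        for i, g in ((i, i % 2) for i in range(60))]
```

If the resulting interval excludes zero, the two raters' agreement with the gold standard differs; the paper compares the type I error of this kind of procedure against GEE2 and weighted least-squares alternatives.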
