20 similar documents found; search time: 531 ms
1.
Sanaullah et al. (2014) suggested generalized exponential chain ratio estimators under a stratified two-phase sampling scheme for estimating the finite population mean. However, the bias and mean square error (MSE) expressions presented in that work need some corrections, and consequently the efficiency-comparison study also requires corrections. In this article, we revisit the Sanaullah et al. (2014) estimator and provide correct bias and MSE expressions for it. We also propose an estimator that is more efficient than several competing estimators, including the classes of estimators in Sanaullah et al. (2014). Three real datasets are used for efficiency comparisons.
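The efficiency gain from a ratio-type estimator is easy to illustrate numerically. The sketch below is not the authors' two-phase estimator; it is a minimal single-phase analogue using the classical exponential ratio-type form t = ȳ·exp((X̄ − x̄)/(X̄ + x̄)), on hypothetical data, assuming the population mean X̄ of the auxiliary variable is known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: study variable y strongly correlated with auxiliary x.
N, n = 5000, 200
x = rng.gamma(shape=4.0, scale=2.0, size=N)
y = 3.0 + 1.5 * x + rng.normal(0.0, 2.0, size=N)
X_bar, Y_bar = x.mean(), y.mean()

def one_draw():
    idx = rng.choice(N, size=n, replace=False)        # SRS without replacement
    yb, xb = y[idx].mean(), x[idx].mean()
    t_mean = yb                                       # usual sample mean
    t_exp = yb * np.exp((X_bar - xb) / (X_bar + xb))  # exponential ratio-type form
    return t_mean, t_exp

draws = np.array([one_draw() for _ in range(3000)])
mse = ((draws - Y_bar) ** 2).mean(axis=0)
print(mse)  # with high positive correlation, the ratio-type MSE is smaller
```

The gain hinges on a strong positive correlation between y and x; with weak correlation the exponential adjustment can hurt.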
2.
Gauss M. Cordeiro, Marcelo Bourguignon, Edwin M. M. Ortega, Thiago G. Ramires, Communications in Statistics - Theory and Methods, 2018, 47(5): 1050-1070
The construction of some wider families of continuous distributions obtained recently has attracted applied statisticians due to the analytical facilities available for easy computation of special functions in programming software. We study some general mathematical properties of the log-gamma-generated (LGG) family defined by Amini, MirMostafaee, and Ahmadi (2014). It generalizes the gamma-generated class pioneered by Ristić and Balakrishnan (2012). We present some of its special models and derive explicit expressions for the ordinary and incomplete moments, generating and quantile functions, mean deviations, Bonferroni and Lorenz curves, Shannon entropy, Rényi entropy, reliability, and order statistics. Models in this family are compared with nested and non-nested models. Further, we propose and study a new LGG family regression model. We demonstrate that the new regression model can be applied to censored data, since it represents a parametric family of models, and therefore it can be used effectively in the analysis of survival data. We show that the proposed models can provide consistently better fits in some applications to real data sets.
3.
Techniques used in variability assessment are subsequently used to draw conclusions regarding the “spread”/uniformity of data curves. Due to their limitations, these techniques are not adequate for circumstances where the data manifest with multiple peaks. Examples of such manifestations (in three-dimensional space) include under-foot pressure distributions recorded for different types of footwear (Becerro-de-Bengoa-Vallejo et al., 2014; Cibulka et al., 1994; Davies et al., 2003), surface textures and interfaces designed to impact friction, and molecular surface structures such as viral epitopes (Torras and Garcia-Valls, 2004; Pacejka, 1997; Fustaffson, 1997). This article proposes a technique for generating a single variable, Λ, that quantifies the uniformity of such surfaces. We define and validate this technique using several mathematical and graphical models.
4.
The two-period crossover design is one of the most commonly used designs in clinical trials, but the estimation of the treatment effect is complicated by the possible presence of a carryover effect. It is known that ignoring the carryover effect when it exists can lead to poor estimates of the treatment effect. The classical approach of Grizzle (1965) consists of two stages. First, a preliminary test is conducted on the carryover effect. If the carryover effect is significant, the analysis is based only on data from period one; otherwise, it is based on data from both periods. A Bayesian approach with improper priors was proposed by Grieve (1985), which uses a mixture of two models: a model with a carryover effect and another without. The indeterminacy of the Bayes factor due to the arbitrary constant in the improper prior was addressed by assigning a minimally discriminatory value to the constant. In this article, we present an objective Bayesian estimation approach to the two-period crossover design that is also based on a mixture model, but uses the commonly recommended Zellner–Siow g-prior. We provide simulation studies and a real data example and compare the numerical results with the approaches of Grizzle (1965) and Grieve (1985).
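The two-stage logic can be sketched as follows (this illustrates the classical Grizzle-style procedure, not the Bayesian mixture approach proposed here): test carryover via the between-sequence comparison of subject totals, then estimate the treatment effect from period-one data if that test rejects, and from within-subject period differences otherwise. The data-generating model and the rough 5% threshold |t| > 2 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30                 # subjects per sequence
tau, pi = 1.0, 0.5     # true treatment and period effects; no carryover here

def simulate(seq_ab):
    subj = rng.normal(0.0, 1.0, n)        # random subject effects
    e1, e2 = rng.normal(0.0, 0.5, (2, n))
    if seq_ab:                            # sequence A then B
        return subj + tau + e1, subj + pi + e2
    return subj + e1, subj + tau + pi + e2  # sequence B then A

y1_ab, y2_ab = simulate(True)
y1_ba, y2_ba = simulate(False)

# Stage 1: carryover test on subject totals (two-sample t statistic).
t_ab, t_ba = y1_ab + y2_ab, y1_ba + y2_ba
se = np.sqrt(t_ab.var(ddof=1) / n + t_ba.var(ddof=1) / n)
t_stat = (t_ab.mean() - t_ba.mean()) / se

if abs(t_stat) > 2.0:   # rough 5% threshold (illustrative)
    est = y1_ab.mean() - y1_ba.mean()     # period-one data only
else:
    d_ab, d_ba = y1_ab - y2_ab, y1_ba - y2_ba
    est = (d_ab.mean() - d_ba.mean()) / 2.0  # within-subject differences
print(round(est, 3))    # close to the true treatment effect tau = 1
```

The within-subject branch cancels the subject effects, which is why it is preferred when no carryover is detected.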
5.
This paper treats the problem of stochastic comparisons for the extreme order statistics arising from heterogeneous beta distributions. Some sufficient conditions involving majorization-type partial orders are provided for comparing the extreme order statistics in the sense of various magnitude orderings, including the likelihood ratio order, the reversed hazard rate order, the usual stochastic order, and the usual multivariate stochastic order. The results established here strengthen and extend those of Kochar and Xu (2007), Mao and Hu (2010), Balakrishnan et al. (2014), and Torrado (2015). A real application in system assembly and some numerical examples are also presented to illustrate the theoretical results.
6.
The Hosmer–Lemeshow test is a widely used method for evaluating the goodness of fit of logistic regression models, but, like other chi-squared tests, its power is strongly influenced by the sample size. Paul, Pennell, and Lemeshow (2013) considered using a large number of groups for large data sets to standardize the power, but simulations show that their method performs poorly for some models; in addition, it does not work when the sample size is larger than 25,000. In the present paper, we propose a modified Hosmer–Lemeshow test that is based on estimation and standardization of the distribution parameter of the Hosmer–Lemeshow statistic. We provide a mathematical derivation for obtaining the critical value and power of our test. Simulations show that our method satisfactorily standardizes the power of the Hosmer–Lemeshow test. It is especially recommended for sufficiently large data sets, as its power is rather stable. A bank marketing data set is also analyzed for comparison with existing methods.
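For reference, the ordinary (decile-of-risk) Hosmer–Lemeshow statistic is Σ_g (o_g − e_g)² / (n_g p̄_g (1 − p̄_g)) over g groups of fitted probabilities. The sketch below computes it on simulated data; the small Newton–Raphson logistic fit is an illustrative stand-in for any fitting routine.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
x = rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 1.2 * x)))
y = rng.binomial(1, p_true)

# Logistic regression by Newton-Raphson (intercept + slope).
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
p_hat = 1.0 / (1.0 + np.exp(-X @ beta))

# Hosmer-Lemeshow statistic with g = 10 groups of fitted probabilities.
order = np.argsort(p_hat)
hl = 0.0
for g in np.array_split(order, 10):
    n_g, o_g, pbar = len(g), y[g].sum(), p_hat[g].mean()
    e_g = n_g * pbar
    hl += (o_g - e_g) ** 2 / (n_g * pbar * (1.0 - pbar))
print(round(hl, 2))  # compare to a chi-squared(8) reference distribution
```

Under a correctly specified model the statistic is approximately chi-squared with g − 2 degrees of freedom, which is the reference the test uses.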
7.
Pao-Sheng Shen, Communications in Statistics - Theory and Methods, 2017, 46(4): 1916-1926
A complication in analyzing tumor data is that tumors detected in a screening program tend to be slowly progressive tumors; this is the so-called left-truncated sampling inherent in screening studies. Under the assumption that all subjects have the same tumor growth function, Ghosh (2008) developed estimation procedures for the Cox proportional hazards model. Shen (2011a) demonstrated that the approach of Ghosh (2008) can be extended to the case where each subject has a specific growth function. In this article, we present a general framework for the analysis of data from cancer screening studies under the linear transformation model. We develop estimation procedures under the linear transformation model, which includes the Cox model as a special case. A simulation study is conducted to demonstrate the potential usefulness of the proposed estimators.
8.
This article proposes new symmetric and asymmetric distributions by applying methods analogous to those in Kim (2005) and Arnold et al. (2009) to the exponentiated normal distribution studied in Durrans (1992), which we call the power-normal (PN) distribution. The proposed bimodal extension, the main focus of the paper, is called the bimodal power-normal model and is denoted by BPN(α), where α is the asymmetry parameter. The authors give some properties, including moments, and discuss maximum likelihood estimation. Two important features of the proposed model are that its normalizing constant has a simple closed form and that its Fisher information matrix is nonsingular, guaranteeing large-sample properties of the maximum likelihood estimators. Finally, simulation studies and real applications reveal that the proposed model can perform well in both situations.
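Since the exponentiated normal of Durrans (1992) has cdf F(x) = Φ(x)^α for α > 0, it can be simulated by inversion: x = Φ⁻¹(u^{1/α}). A minimal sketch (of this unimodal PN building block, not of the bimodal BPN extension) using only the Python standard library:

```python
import random
from statistics import NormalDist

def rpn(alpha, size, seed=0):
    """Draw from the power-normal distribution with cdf F(x) = Phi(x)**alpha."""
    rng = random.Random(seed)
    nd = NormalDist()
    # Inverse-cdf method: if U ~ Uniform(0,1), then Phi^{-1}(U^{1/alpha}) ~ PN(alpha).
    return [nd.inv_cdf(rng.random() ** (1.0 / alpha)) for _ in range(size)]

alpha = 2.0
sample = sorted(rpn(alpha, 100_000))
emp_median = sample[len(sample) // 2]
theo_median = NormalDist().inv_cdf(0.5 ** (1.0 / alpha))  # solve Phi(x)**alpha = 1/2
print(round(emp_median, 3), round(theo_median, 3))
```

For α = 1 this reduces to the standard normal; for α ≠ 1 the distribution is skewed, which is what the quantile check above exercises.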
9.
Recently, the conditional Rényi divergence of order α and Kerridge's inaccuracy measure were studied by Navarro et al. (2014). In the present article, a generalized dynamic conditional Kerridge inaccuracy measure is introduced, which can be represented as the sum of a conditional Rényi divergence and a Rényi entropy. Some useful bounds are obtained using the concept of likelihood ratio order. The results are extended to weighted distributions. Sufficient conditions are obtained for the monotonicity of the proposed measure. Characterizations of the bivariate exponential conditional distribution are presented based on the proposed measure.
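For discrete distributions, the unconditional Rényi quantities underlying these measures have the standard forms H_α(P) = (1/(1−α)) log Σ p_i^α and D_α(P‖Q) = (1/(α−1)) log Σ p_i^α q_i^{1−α}; a minimal sketch (the pmfs are illustrative):

```python
import math

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1) of a pmf p."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha between pmfs p and q (all q_i > 0)."""
    s = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1.0)

uniform = [0.25] * 4
p = [0.7, 0.1, 0.1, 0.1]
print(renyi_entropy(uniform, 0.5))   # log 4 for the uniform pmf, at any order
print(renyi_divergence(p, p, 2.0))   # 0: a distribution against itself
print(renyi_divergence(p, uniform, 2.0))
```

Both quantities recover the Shannon entropy and the Kullback–Leibler divergence in the limit α → 1.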
10.
Recently, Koyuncu et al. (2013) proposed an exponential-type estimator to improve the efficiency of the mean estimator based on the randomized response technique. In this article, we propose an improved exponential-type estimator that is more efficient than the Koyuncu et al. (2013) estimator, which in turn was shown to be more efficient than the usual mean estimator, the ratio estimator, the regression estimator, and the Gupta et al. (2012) estimator. Under the simple random sampling without replacement (SRSWOR) scheme, bias and mean square error expressions for the proposed estimator are obtained up to first order of approximation, and comparisons are made with the Koyuncu et al. (2013) estimator. A simulation study is used to observe the performance of the two estimators. Theoretical findings are also supported by a numerical example with real data. We also show how to extend the proposed estimator to the case when more than one auxiliary variable is available.
11.
This article recasts the optimal allocation of coverage limits for two independent random losses. Under some regularity conditions on the two concerned probability density functions, we establish a necessary and sufficient condition for the existence of the optimal allocation of coverage limits, and derive the optimal allocation whenever it exists. The results supplement Lu and Meng (2011, Proposition 5.2) and Hu and Wang (2014, Theorem 5.1).
12.
Repeated measurement designs are widely used in medicine, pharmacology, animal sciences, and psychology. In this paper, the works of Iqbal and Tahir (2009) and Iqbal, Tahir, and Ghazali (2010) are generalized for the construction of circular balanced and circular strongly balanced repeated measurement designs through the method of cyclic shifts for three periods.
13.
The generalized inverse Weibull distribution is a new lifetime probability distribution that can be used to model a variety of failure characteristics. It has several desirable properties and nice physical interpretations that enable it to be used frequently. In this article, we present a chi-squared goodness-of-fit test for an accelerated failure time (AFT) model with the generalized inverse Weibull (GIW) distribution as the baseline distribution, for both complete and censored data. This test is based on a modification of the NRR (Nikulin-Rao-Robson) statistic Y², proposed by Bagdonavičius and Nikulin (2011) for censored data. Two applications to real data are given to illustrate the potential of the proposed test.
14.
In this article, we establish the complete moment convergence of a moving-average process generated by a class of random variables satisfying the Rosenthal-type maximal inequality and the weak mean dominating condition. On the one hand, we give the correct proof for the case p = 1 in Ko (2015); on the other hand, we also consider the case αp = 1, which was not considered in Ko (2015). The results obtained in this article generalize some corresponding ones for dependent sequences.
15.
In analogy with the weighted Shannon entropy proposed by Belis and Guiasu (1968) and Guiasu (1986), we introduce a new information measure called the weighted cumulative residual entropy (WCRE). This is based on the cumulative residual entropy (CRE) introduced by Rao et al. (2004). This new information measure is a “length-biased,” shift-dependent measure that assigns larger weights to larger values of the random variable. The properties of the WCRE and a formula relating the WCRE and the weighted Shannon entropy are given. Related results in reliability theory are covered. Our results include inequalities and various bounds on the WCRE. The conditional WCRE and some of its properties are discussed. The empirical WCRE is proposed to estimate this new information measure. Finally, strong consistency and a central limit theorem are provided.
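Assuming the definition WCRE(X) = −∫₀^∞ x F̄(x) log F̄(x) dx (the CRE integrand of Rao et al., 2004, weighted by x), the empirical version replaces F̄ by the empirical survival function, which is piecewise constant between order statistics. For a standard exponential the true value is ∫₀^∞ x · x e^{−x} dx = 2, which the estimate approaches; the sample size and tolerance below are illustrative.

```python
import numpy as np

def empirical_wcre(sample):
    """Empirical weighted cumulative residual entropy for nonnegative data."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    i = np.arange(1, n)
    surv = 1.0 - i / n                    # S_n on the interval [x[i-1], x[i])
    # Integrate -x * S log S piecewise: the x-integral over each interval
    # [x[i-1], x[i]) equals (x[i]**2 - x[i-1]**2) / 2, with S constant there.
    chunks = (x[1:] ** 2 - x[:-1] ** 2) / 2.0
    return float(-(chunks * surv * np.log(surv)).sum())

rng = np.random.default_rng(3)
est = empirical_wcre(rng.exponential(1.0, size=200_000))
print(round(est, 3))  # theoretical value for Exp(1) is 2
```

The last interval (beyond the largest order statistic) contributes nothing because S_n = 0 there, matching the convention S log S → 0.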
16.
The probability matching prior for linear functions of Poisson parameters is derived. A comparison is made between the confidence intervals obtained by Stamey and Hamilton (2006) and the intervals we derive using the Jeffreys and probability matching priors. The intervals obtained from the Jeffreys prior are in some cases fiducial intervals (Krishnamoorthy and Lee, 2010). A weighted Monte Carlo method is used for the probability matching prior. The power and size of the test, using Bayesian methods, are compared to those of the tests used by Krishnamoorthy and Thomson (2004). The Jeffreys prior, the probability matching prior, and two other priors are used.
17.
Haifeng Xu, Communications in Statistics - Theory and Methods, 2017, 46(7): 3123-3134
In this article, assuming that the error terms follow a multivariate t distribution, we derive the exact formulae for the moments of the heterogeneous preliminary test (HPT) estimator proposed by Xu (2012b). We also carry out numerical evaluations to investigate the mean squared error (MSE) performance of the HPT estimator and compare it with those of the feasible ridge regression (FRR) estimator and the usual ordinary least squares (OLS) estimator. Further, we derive the optimal critical values of the preliminary F test for the HPT estimator, using the minimax regret function proposed by Sawa and Hiromatsu (1973). Our results show that (1) the optimal significance level (α*) increases as the degrees of freedom of the multivariate t distribution (ν0) increase; (2) when ν0 ≥ 10, the value of α* is close to that in the normal error case.
18.
In this paper, the focus is on sequential analysis of multivariate financial time series with heavy tails. The mean vector and the covariance matrix of multivariate nonlinear models are simultaneously monitored by modifying conventional control charts to identify structural changes in the data. The considered target process is a constant conditional correlation model (cf. Bollerslev, 1990), an extended constant conditional correlation model (cf. He and Teräsvirta, 2004), a dynamic conditional correlation model (cf. Engle, 2002), or a generalized dynamic conditional correlation model (cf. Cappiello et al., 2006). For statistical surveillance we use control charts based on residuals; further, the procedures are constructed for the t-distribution. The detection speed of these charts is compared via Monte Carlo simulation. In the empirical study, the procedure with the best performance is applied to log-returns of the stock market indices FTSE and CAC.
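In the univariate case, residual-based surveillance of this kind reduces to charting functions of standardized residuals. As a toy illustration (not the multivariate CCC/DCC procedures studied here), the following EWMA chart tracks squared residuals and signals a variance shift; the change point, shift size, and calibration of the control limit from an in-control stretch are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
# In-control: unit variance; after t = 500 the residual variance quadruples.
resid = np.concatenate([rng.normal(0, 1.0, 500), rng.normal(0, 2.0, 300)])

lam = 0.1
z = np.empty(len(resid))
z[0] = 1.0  # start at the in-control mean of the squared residuals
for t in range(1, len(resid)):
    z[t] = lam * resid[t] ** 2 + (1 - lam) * z[t - 1]

# Demo calibration: set the limit just above the observed in-control path.
limit = z[:500].max() * 1.05
alarm = next((t for t in range(500, len(z)) if z[t] > limit), None)
print(alarm)  # first signal shortly after the change point at t = 500
```

The smoothing constant λ trades off detection speed against in-control stability; small λ accumulates evidence and reacts quickly to sustained shifts.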
19.
Since the seminal paper of Ghirardato (1997), it has been known that a Fubini theorem for non-additive measures is available only for “slice-comonotonic” functions in the framework of product algebras. Later, inspired by Ghirardato (1997), Chateauneuf and Lefort (2008) obtained some Fubini theorems for non-additive measures in the framework of product σ-algebras. In this article, we study the Fubini theorem for non-additive measures in the framework of g-expectation. We give several different assumptions under which a Fubini theorem holds in the framework of g-expectation.
20.
This study considers efficient mixture designs for the approximation of the response surface of a quantile regression model, which is a second-degree polynomial, by a first-degree polynomial in the proportions of q components. Instead of the least squares estimation of traditional regression analysis, the objective function in quantile regression models is a weighted sum of absolute deviations, and the least absolute deviations (LAD) estimation technique should be used (Bassett and Koenker, 1982; Koenker and Bassett, 1978). Therefore, the standard optimal mixture designs, such as the D-optimal or A-optimal mixture designs for least squares estimation, are not appropriate. This study explores mixture designs that minimize the bias between the approximating first-degree polynomial and the second-degree polynomial response surface under LAD estimation. In contrast to the standard optimal mixture designs for least squares estimation, the efficient designs might contain elementary centroid design points of degrees higher than two. An example of a portfolio with five assets is given to illustrate the proposed efficient mixture designs in determining the marginal contribution of risks by individual assets in the portfolio.
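LAD (median regression) fitting can be sketched with iteratively reweighted least squares, using weights 1/|r|; this is a standard approximation to the exact linear-programming solution, not the design-construction method of this study. The simulated line, heavy-tailed errors, and iteration count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
x = rng.uniform(0.0, 10.0, n)
y = 2.0 + 3.0 * x + rng.laplace(0.0, 1.0, n)   # heavy-tailed errors favor LAD
X = np.column_stack([np.ones(n), x])

# IRLS for least absolute deviations: weights w = 1 / max(|r|, eps),
# so each weighted least squares step approximates the L1 objective.
beta = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS starting values
for _ in range(60):
    r = y - X @ beta
    w = 1.0 / np.maximum(np.abs(r), 1e-8)
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
print(np.round(beta, 3))  # approximately the LAD fit of intercept 2, slope 3
```

For exact LAD estimates (and general quantiles τ ≠ 0.5) the problem is usually posed as a linear program, as in the Koenker–Bassett formulation cited above.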