Similar Documents
20 similar documents found (search time: 31 ms)
1.
The concept of neighbor designs was introduced and defined by Rees (1967), along with some methods of their construction. Since then, many methods of constructing neighbor designs, as well as their generalizations, have appeared in the literature. However, there are only a few results on their optimality. The purpose of this article is therefore to give an overview of work on this problem. Recent results on the optimality of specified neighbor-balanced designs under various interference models with block effects are presented, and these results are then compared with the corresponding models in which block effects are not significant.

2.
Testing for stratum effects in the Cox model is an important and commonly asked question in medical research as well as in many other fields. In this paper, we discuss the problem for interval-censored failure time data and generalize the procedure given in Sun and Yang (2000) for right-censored data. The asymptotic distribution of the new test statistic is established, and a simulation study conducted to evaluate the finite-sample properties of the method suggests that the generalized procedure works well in practical situations. An application is provided.

3.
In this paper, we develop a zero-inflated NGINAR(1) process as an alternative to the NGINAR(1) process (Ristić, Nastić, and Bakouch 2009) when the number of zeros in the data is larger than the number of zeros expected under the geometric process. The proposed process has zero-inflated geometric marginals and contains the NGINAR(1) process as a particular case. Various properties of the new process are derived, such as the conditional distribution and the autocorrelation structure. Yule-Walker, probability-based Yule-Walker, conditional least squares, and conditional maximum likelihood estimators of the model parameters are derived. An extensive Monte Carlo experiment is conducted to evaluate the performance of these estimators in finite samples. Forecasting performance of the model is discussed. An application to a real data set shows the flexibility and potential of the new model.
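The thinning recursion behind NGINAR(1)-type processes, and the Yule-Walker idea of reading the dependence parameter off the lag-1 autocorrelation, can be sketched as follows. This is an illustrative simulation only: the innovation law here (a geometric count inflated with extra zeros with probability `pi0`) is an assumption made for the sketch, not the exact zero-inflated innovation distribution derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def nb_thinning(x, alpha, rng):
    # Negative binomial thinning: sum of x i.i.d. geometric counts with mean alpha.
    if x == 0:
        return 0
    p = 1.0 / (1.0 + alpha)                    # success prob giving mean alpha
    return rng.geometric(p, size=x).sum() - x  # numpy's geometric starts at 1

def simulate(n, alpha, mu, pi0, rng):
    # Illustrative innovations: geometric(mean mu), inflated with zeros w.p. pi0.
    x = np.zeros(n, dtype=int)
    for t in range(1, n):
        eps = 0 if rng.random() < pi0 else rng.geometric(1.0 / (1.0 + mu)) - 1
        x[t] = nb_thinning(x[t - 1], alpha, rng) + eps
    return x

x = simulate(5000, alpha=0.4, mu=1.0, pi0=0.3, rng=rng)

# Yule-Walker estimate of alpha: the lag-1 sample autocorrelation.
xc = x - x.mean()
alpha_hat = (xc[1:] * xc[:-1]).sum() / (xc ** 2).sum()
```

For INAR-type models the autocorrelation function is alpha**k at lag k regardless of the innovation law, which is why the lag-1 autocorrelation identifies alpha.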

4.
We comment on the paper of Yildiz (2017) and correct the theorems in that paper.

5.
The aim of this letter is to acknowledge priority on calibration estimation. There are numerous studies on calibration estimation in the literature. These studies are reviewed, and it is found that an existing calibration estimator is reproduced in the recent paper by Nidhi et al. (2017).

6.
Empirical Bayes (EB) methods are very useful for post-selection inference. Following Datta et al. (2002), we construct EB confidence intervals for the selected population mean. The EB intervals are adjusted to achieve the target coverage probabilities asymptotically up to the second order. Both the unconditional coverage probabilities of the EB intervals and the corresponding probabilities conditional on ancillary statistics are obtained.
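A minimal sketch of the naive first-order EB interval in the normal-normal model, with method-of-moments plug-ins for the prior mean and variance. The data and parameter values are invented, and the second-order coverage adjustment of Datta et al. is deliberately not implemented; this only shows the baseline construction that such adjustments refine.

```python
import numpy as np

rng = np.random.default_rng(7)
theta = rng.normal(0.0, 2.0, size=50)   # population means (prior variance 4)
y = rng.normal(theta, 1.0)              # one unit-variance observation each

# Empirical Bayes: plug in moment estimates of the prior mean and variance.
m_hat = y.mean()
a_hat = max(y.var(ddof=1) - 1.0, 0.0)   # prior variance estimate
b = a_hat / (a_hat + 1.0)               # shrinkage factor
post_mean = m_hat + b * (y - m_hat)     # shrunken estimates of each mean
post_sd = np.sqrt(b)                    # naive posterior sd (no EB adjustment)

# Naive 95% EB interval for each mean; it undercovers without adjustment.
lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
```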

7.
This paper revisits two bivariate Pareto models for fitting competing risks data. The first is the Frank copula model, and the second is a bivariate Pareto model introduced by Sankaran and Nair (1993). We discuss the identifiability issues of these models and develop maximum likelihood estimation procedures, including their computational algorithms and model-diagnostic procedures. Simulations are conducted to examine the performance of the maximum likelihood estimation. Real data are analyzed for illustration.

8.
This article presents a new class of realized stochastic volatility models based jointly on realized volatilities and returns. We generalize the traditional logarithm transformation of realized volatility to the Box–Cox transformation, a more flexible parametric family of transformations. A two-step maximum likelihood estimation procedure is introduced to estimate this model, following Koopman and Scharth (2013). Simulation results show that the two-step estimator performs well, and that a misspecified log transformation may lead to inaccurate parameter estimates and excess skewness and kurtosis. Finally, an empirical investigation of realized volatility measures and daily returns is carried out for several stock indices.
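The Box–Cox family used here nests the log transform as the limiting case lambda = 0; a minimal sketch (the realized-variance values are made up for illustration):

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox transform; lam = 0 is defined as the log-transform limit."""
    y = np.asarray(y, dtype=float)
    if lam == 0.0:
        return np.log(y)
    return (y ** lam - 1.0) / lam

rv = np.array([0.5, 1.0, 2.0, 4.0])   # hypothetical realized variances
z_log = box_cox(rv, 0.0)              # the traditional log transformation
z_lin = box_cox(rv, 1.0)              # lam = 1: a shifted identity
```

Letting the data choose lambda, rather than fixing the log transform, is what gives the model its extra flexibility against skewness and kurtosis in transformed realized volatility.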

9.
The two-period crossover design is one of the most commonly used designs in clinical trials, but the estimation of the treatment effect is complicated by the possible presence of a carryover effect. It is known that ignoring the carryover effect when it exists can lead to poor estimates of the treatment effect. The classical approach of Grizzle (1965) consists of two stages. First, a preliminary test is conducted on the carryover effect. If the carryover effect is significant, the analysis is based only on data from period one; otherwise, the analysis is based on data from both periods. A Bayesian approach with improper priors was proposed by Grieve (1985), which uses a mixture of two models: a model with a carryover effect and another without. The indeterminacy of the Bayes factor due to the arbitrary constant in the improper prior was addressed by assigning a minimally discriminatory value to the constant. In this article, we present an objective Bayesian estimation approach to the two-period crossover design that is also based on a mixture model, but uses the commonly recommended Zellner–Siow g-prior. We provide simulation studies and a real data example, and compare the numerical results with the approaches of Grizzle (1965) and Grieve (1985).
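Grizzle's two-stage logic can be sketched as below on simulated data. The carryover test on subject totals and the 10% preliminary-test level follow common practice; the group sizes and effect sizes are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical trial: sequence AB gets treatment then placebo; BA the reverse.
# The true treatment effect is 2.0 and no carryover is simulated.
ab = rng.normal([5.0, 3.0], 1.0, size=(20, 2))   # columns: period 1, period 2
ba = rng.normal([3.0, 5.0], 1.0, size=(20, 2))

# Stage 1: preliminary test for carryover, based on subject totals over periods.
_, p_carry = stats.ttest_ind(ab.sum(axis=1), ba.sum(axis=1))

if p_carry < 0.10:
    # Carryover deemed significant: analyze period-1 data only.
    effect = ab[:, 0].mean() - ba[:, 0].mean()
else:
    # Otherwise use within-subject period differences from both periods.
    effect = 0.5 * (np.mean(ab[:, 0] - ab[:, 1]) + np.mean(ba[:, 1] - ba[:, 0]))
```

Both branches target the same treatment effect here; the article's Bayesian mixture approach avoids this all-or-nothing preliminary test by averaging over the with- and without-carryover models.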

10.
This article proposes an efficient method of estimating a rare sensitive attribute, assumed to follow a Poisson distribution, by using a three-stage unrelated randomized response model instead of the Land et al. (2011) model, when the population consists of clusters of different sizes selected by probability proportional to size (PPS) sampling. The rare sensitive parameter is estimated using PPS sampling and equal-probability two-stage sampling, both when the parameter of the rare unrelated attribute is assumed known and when it is unknown.

We extend this method to a stratified population by applying stratified PPS sampling and stratified equal-probability two-stage sampling. An empirical study is carried out to show the efficiency of the two proposed methods when the parameter of the rare unrelated attribute is assumed known and unknown.

11.
The case–cohort design was proposed by Prentice (1986) in order to reduce costs. It involves collecting covariate data from all subjects who experience the event of interest and from the members of a random subcohort. The case–cohort design has been studied extensively, but has been considered exclusively for right-censored data. In this article, we propose case–cohort designs adapted to length-biased data under the proportional hazards assumption. A pseudo-likelihood procedure is described for estimating the parameters and the corresponding cumulative hazard function. The large-sample properties, such as consistency and weak convergence, of the pseudo-likelihood estimators are presented. We also conduct simulation studies to show that the proposed estimators are appropriate for practical use. A real data set on the Oscar Awards is analyzed.

12.
The purpose of mixture experiments is to explore the optimum blends of mixture components that will provide the desired response characteristics in finished products. D-optimal minimal designs have been considered for a variety of mixture models, including Scheffé's linear, quadratic, and cubic models. These D-optimal designs are usually minimally supported, since they have exactly as many design points as parameters; they therefore lack the degrees of freedom needed to perform lack-of-fit (LOF) tests. Moreover, the majority of the design points in D-optimal minimal designs lie on the boundary of the design simplex: its vertices, edges, or faces. In this article, extensions of D-optimal minimal designs are developed for a general mixture model, allowing additional interior points in the design space so that the entire response surface can be predicted. A new strategy for adding multiple interior points for symmetric mixture models is also proposed. We compare the proposed designs with Cornell's (1986) two 10-point designs for the LOF test by simulation.
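As a concrete instance of a minimally supported design, the D-criterion for Scheffé's quadratic model in three components can be evaluated on the three simplex vertices plus the three edge midpoints (six points for six parameters, so no degrees of freedom remain for lack of fit); a sketch:

```python
import numpy as np

def scheffe_quadratic(points):
    """Model matrix for Scheffe's quadratic model in 3 mixture components."""
    x1, x2, x3 = points.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

# Minimally supported design: 3 vertices + 3 edge midpoints of the simplex.
design = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                   [.5, .5, 0], [.5, 0, .5], [0, .5, .5]], dtype=float)
X = scheffe_quadratic(design)
d_crit = np.linalg.det(X.T @ X)   # D-criterion: maximize det(X'X)
```

Note that all six points sit on the simplex boundary; the article's extensions add interior points precisely to remedy this and to enable prediction over the whole simplex.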

13.
This paper develops a new test for the parametric volatility function of a diffusion model based on nonparametric estimation techniques. The proposed test imposes no restriction on the functional form of the drift function and has an asymptotically standard normal distribution under the null hypothesis of correct specification. It is consistent against any fixed alternative and has nontrivial asymptotic power against a class of local alternatives with proper rates. Monte Carlo simulations show that the test performs well in finite samples and generally has better power than the nonparametric test of Li (2007) and the stochastic-process-based tests of Dette and Podolskij (2008). When the test is applied to high-frequency data on the EUR/USD exchange rate, the empirical results show that the commonly used volatility functions fit more poorly as the data frequency increases, and that general volatility functions fit better than the constant volatility function.

14.
In this paper, a Bayesian procedure is applied to obtain control limits for the location and scale parameters, as well as for a one-sided upper tolerance limit, in the case of the two-parameter exponential distribution. An advantage of the upper tolerance limit is that it monitors the location and scale parameters at the same time. Using Jeffreys' non-informative prior, the predictive distributions of future maximum likelihood estimators of the location and scale parameters are derived analytically. The predictive distributions are used to determine the distribution of the run-length and the expected run-length. A data set given in Krishnamoorthy and Mathew (2009) is used for illustration; the data are the mileages of military personnel carriers that failed in service. The paper illustrates the flexibility and unique features of the Bayesian simulation method.
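The maximum likelihood estimators whose predictive distributions the paper works with have simple closed forms for the two-parameter exponential model: the sample minimum for the location and the mean excess over the minimum for the scale. A sketch on simulated failure mileages (the parameter values are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, theta_true = 10.0, 2.0                    # hypothetical location, scale
x = mu_true + rng.exponential(theta_true, size=200)

mu_hat = x.min()                  # MLE of the location (threshold) parameter
theta_hat = x.mean() - x.min()    # MLE of the scale parameter
```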

15.
In analogy with the weighted Shannon entropy proposed by Belis and Guiasu (1968) and Guiasu (1986), we introduce a new information measure called the weighted cumulative residual entropy (WCRE). It is based on the cumulative residual entropy (CRE) introduced by Rao et al. (2004). This new information measure is a "length-biased," shift-dependent measure that assigns larger weights to larger values of the random variable. The properties of the WCRE and a formula relating the WCRE to the weighted Shannon entropy are given. Related applications in reliability theory are covered. Our results include inequalities and various bounds on the WCRE. The conditional WCRE and some of its properties are discussed. An empirical WCRE is proposed to estimate the new measure, and its strong consistency and a central limit theorem are established.
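One plausible discretization of an empirical WCRE of the form -∫ t S(t) log S(t) dt, with S the empirical survival function, is sketched below. The interval-midpoint weighting is an assumption made for this sketch; the paper's exact empirical estimator may differ in how the weights are assigned.

```python
import numpy as np

def empirical_wcre(x):
    """Riemann-sum estimate of -∫ t * S(t) * log S(t) dt over the sample range,
    with S the empirical survival function (piecewise constant between order
    statistics) and t approximated by interval midpoints."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    s = 1.0 - np.arange(1, n) / n          # survival on [x_(i), x_(i+1))
    dt = np.diff(x)                        # interval lengths
    mid = 0.5 * (x[:-1] + x[1:])           # the weight t, taken at midpoints
    mask = s > 0                           # avoid log(0) on the last interval
    return -np.sum(mid[mask] * s[mask] * np.log(s[mask]) * dt[mask])

wcre_hat = empirical_wcre(np.linspace(0.1, 1.0, 100))
```

Since 0 < S(t) < 1 on the interior intervals, each summand is positive, so the estimate is nonnegative for nonnegative data, matching the length-biased weighting of larger values.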

16.
An increasing generalized failure rate of a lifetime X defines an ageing concept, denoted by IGFR. Another notion, denoted by DRPFR, is defined by the decreasingness of the reversed proportional failure rate. In this article, we provide characterizations of both IGFR and DRPFR absolutely continuous lifetimes, based on the monotonicity of quotients of probabilistic functionals and a result of Nanda and Shaked (2001). We derive necessary conditions for the IGFR notion based on stochastic orderings of truncated distributions, and we prove that the product of DRPFR lifetimes is also DRPFR, that the IGFR property is preserved under composition with certain risk-aversion utility functions, and that order statistics and records (and the subsequent order statistic or record) are IGFR under suitable assumptions, with similar results for DRPFR lifetimes. We also provide sufficient conditions for the hazard rate ordering of products and random products of IGFR lifetimes, and similar results for the reversed hazard rate order and DRPFR lifetimes, together with a complementary result for the mean residual life order of random products of two families of IGFR lifetimes. We derive upper and lower bounds for the cumulative distribution function of the product of IGFR lifetimes, and lower bounds for the risk function of an IGFR lifetime based on the distribution moments; these bounds are extended to the product of IGFR lifetimes. Applications of the results to insurance portfolios are discussed extensively.
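As a quick numerical illustration of the IGFR notion, take the generalized failure rate to be g(x) = x * h(x), with h the hazard rate (a common convention; stated here as an assumption of the sketch). For the exponential law, g(x) = lambda * x, which is increasing, so exponential lifetimes are IGFR:

```python
import numpy as np

def generalized_failure_rate(x, pdf, sf):
    # g(x) = x * h(x), with hazard h(x) = pdf(x) / sf(x).
    return x * pdf(x) / sf(x)

lam = 1.5
x = np.linspace(0.1, 5.0, 50)
g = generalized_failure_rate(
    x,
    pdf=lambda t: lam * np.exp(-lam * t),   # exponential density
    sf=lambda t: np.exp(-lam * t),          # exponential survival function
)
# g reduces to lam * x, a strictly increasing function of x.
```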

17.
In this article, we develop a nonparametric estimator of the local average response of a censored dependent variable to endogenous regressors in a nonseparable model, where the unobservable error term is not restricted to be scalar and the nonseparable function need not be monotone in the unobservables. We formalize the identification argument put forward in Altonji, Ichimura, and Otsu (2012), construct a nonparametric estimator, characterize its asymptotic properties, and conduct a Monte Carlo investigation of its small-sample properties. Identification is constructive and is achieved through a control function approach. We show that the estimator is consistent and asymptotically normally distributed. The Monte Carlo results are encouraging.

18.
In this paper, the adaptive estimation for varying coefficient models proposed by Chen, Wang, and Yao (2015) is extended to allow for nonstationary covariates. The asymptotic properties of the estimator are obtained, showing different convergence rates for integrated covariates and stationary covariates. The nonparametric estimator of the functional coefficient with integrated covariates has a faster convergence rate than the estimator with stationary covariates, and its asymptotic distribution is mixed normal. Moreover, the adaptive estimation is more efficient than least squares estimation for non-normal errors. A simulation study is conducted to illustrate the theoretical results.

19.
Adaptive designs find an important application in the estimation of unknown percentiles of an underlying dose-response curve. A nonparametric adaptive design was suggested by Mugno et al. (2004) to simultaneously estimate multiple percentiles of an unknown dose-response curve via generalized Pólya urns. In this article, we examine the properties of this design when delays in observing responses are encountered. Using simulations, we evaluate a modification of the design under varying group sizes. Our results demonstrate unbiased estimation with minimal loss of efficiency compared to the original compound urn design.

20.
A complication in analyzing tumor data is that the tumors detected in a screening program tend to be slowly progressive tumors; this is the left-truncated sampling inherent in screening studies. Under the assumption that all subjects have the same tumor growth function, Ghosh (2008) developed estimation procedures for the Cox proportional hazards model. Shen (2011a) demonstrated that Ghosh's (2008) approach can be extended to the case where each subject has a specific growth function. In this article, we present a general framework for the analysis of data from cancer screening studies under the linear transformation model, which includes Cox's model as a special case, and develop the corresponding estimation procedures. A simulation study is conducted to demonstrate the potential usefulness of the proposed estimators.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号