1.
Luiz Antonio de Freitas, Communications in Statistics - Simulation and Computation, 2013, 42(1):8-23
In this article, we consider the standard cure rate model proposed by Boag (1949) and Berkson and Gage (1952). We present a new definition of informative censoring, similar to that of Lawless (1982), together with the corresponding likelihood function. Under informative censoring, we obtain the Fisher information matrix of the exponential standard cure rate model. Using simulated data, we assess the impact of informative censoring on the coverage probabilities and lengths of asymptotic confidence intervals for the parameters of interest. An example with real data is analyzed.
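To fix ideas, here is a minimal sketch, entirely our own (not the authors' code), of the exponential standard cure rate model under *non-informative* censoring: the survivor function is S(t) = pi + (1 - pi)·exp(-lam·t), with pi the cured fraction. All parameter values, the censoring design, and the coarse grid search are illustrative assumptions.

```python
import math
import random

random.seed(42)

# Mixture cure rate model (Boag 1949; Berkson & Gage 1952), exponential
# latency: S(t) = pi + (1 - pi) * exp(-lam * t). Values are illustrative.
pi_true, lam_true, n = 0.3, 1.0, 500
times, events = [], []
for _ in range(n):
    # cured subjects never experience the event
    t = float("inf") if random.random() < pi_true else random.expovariate(lam_true)
    c = random.uniform(0.0, 3.0)          # independent (non-informative) censoring
    times.append(min(t, c))
    events.append(t <= c)                 # True = event observed

def loglik(pi, lam):
    ll = 0.0
    for t, d in zip(times, events):
        if d:   # event: density (1 - pi) * lam * exp(-lam * t)
            ll += math.log((1 - pi) * lam) - lam * t
        else:   # censored: survivor pi + (1 - pi) * exp(-lam * t)
            ll += math.log(pi + (1 - pi) * math.exp(-lam * t))
    return ll

# crude grid-search MLE for (pi, lam), adequate for a demonstration
grid = [(p / 100, l / 10) for p in range(5, 95, 5) for l in range(2, 31)]
pi_hat, lam_hat = max(grid, key=lambda g: loglik(*g))
```

Under informative censoring the censoring mechanism itself carries information about (pi, lam), so the censored-data likelihood above is no longer the full likelihood; quantifying the resulting distortion of interval coverage is the abstract's focus.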
2.
The spectral measure plays a key role in the statistical modeling of multivariate extremes. Estimating the spectral measure is a complex issue, given the need to obey a certain moment condition. We propose a Euclidean likelihood-based estimator for the spectral measure that is simple and explicitly defined, with an expression free of Lagrange multipliers. Our estimator is shown to have the same limit distribution as the maximum empirical likelihood estimator of Einmahl and Segers (2009). Numerical experiments suggest good overall performance, essentially identical to that of the maximum empirical likelihood estimator. We illustrate the method with an analysis of extreme temperature data.
3.
The log-Birnbaum-Saunders regression model introduced by Rieck and Nedelman (1991) is useful for modeling lifetimes of materials and equipment subject to different conditions. Our goal in this article is twofold. First, we numerically evaluate the finite-sample performance of the likelihood ratio, score, and Wald tests in the log-Birnbaum-Saunders regression model. Second, we introduce a RESET-like misspecification test for that model: the null hypothesis that the model is correctly specified is tested against the alternative of model misspecification. The power of the test is evaluated using Monte Carlo simulations. Bootstrap-based inference is also considered. An empirical application is presented and discussed.
4.
Journal of Statistical Computation and Simulation, 2012, 82(3):171-181
Inference concerning the ratio of two means based on two independent two-parameter gamma models with a common shape parameter was examined by Booth et al. (1999), who developed a computationally intensive bootstrap calibration method. In this paper, a likelihood-based method is proposed for small-sample inference about the ratio of the two means of two-parameter gamma models when the shape parameters may or may not be equal. The proposed method is very simple to use and, as illustrated in simulation studies, gives extremely accurate results.
5.
Communications in Statistics - Theory and Methods, 2013, 42(2):371-380
Palmer and Broemeling [1] compare Bayes and maximum likelihood estimates of the intraclass correlation (ICC). The prior information in their derivation of the Bayes estimator is placed on the variance components rather than on the ICC itself. This paper derives a Bayes estimator of the ICC with the prior placed on the ICC. Bayes estimates based on three different priors are then compared to the method of moments estimate.
6.
Samridhi Mehta, Communications in Statistics - Theory and Methods, 2018, 47(16):4021-4028
Sihm et al. (2016) proposed an unrelated-question binary optional randomized response technique (RRT) model for estimating the proportion of a population that possesses a sensitive characteristic and the sensitivity level of the question. In our work, a decision-theoretic approach is followed to obtain Bayes estimates of the two parameters, along with their corresponding minimal Bayes posterior expected losses (BPEL), using a beta prior and the squared error loss function (SELF). Relative losses are also examined to compare the performance of the Bayes estimates with that of the classical estimates obtained by Sihm et al. (2016). The results are illustrated with real survey data using a non-informative prior.
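A hedged sketch of the kind of Bayes estimate involved: under a Beta(a, b) prior and squared error loss (SELF), the Bayes estimator of a binomial "yes" probability is the posterior mean; in an unrelated-question design the "yes" probability is a linear function of the sensitive proportion, lambda = P·pi + (1 - P)·pi_u. The design probability P, the innocuous proportion pi_u, and the survey counts below are all made up for illustration and are not from Sihm et al. (2016).

```python
def posterior_mean(yes, n, a, b):
    """Bayes estimate of the 'yes' probability under Beta(a, b) + SELF:
    the posterior Beta(yes + a, n - yes + b) mean."""
    return (yes + a) / (n + a + b)

def sensitive_proportion(lam_hat, P, pi_u):
    """Invert lambda = P*pi + (1 - P)*pi_u to recover the sensitive pi."""
    return (lam_hat - (1 - P) * pi_u) / P

P, pi_u = 0.7, 0.5          # illustrative design probability, innocuous proportion
yes, n = 230, 500           # illustrative survey counts
lam_hat = posterior_mean(yes, n, 1, 1)            # Beta(1,1): non-informative prior
pi_hat = sensitive_proportion(lam_hat, P, pi_u)   # point estimate of sensitive trait
```

With a Beta(1, 1) (uniform) prior this reproduces the non-informative-prior flavor of the abstract's illustration; informative beta priors simply change (a, b).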
7.
Boardman and Kendell (1970) considered the problem of estimation under Type-I censoring when an item is subject to only one of two causes of failure, assuming an exponential model. Patel and Gajjar (1992) extended Boardman and Kendell's results to two-stage progressive censoring. Here we consider a geometric competing-risk failure model with two independent causes of failure. Maximum likelihood estimation of the parameters is carried out using Type-I two-stage progressively censored and group-censored samples. Asymptotic standard errors of the estimators are obtained for both cases. Two illustrative examples are given, for the ungrouped and grouped competing-risk models.
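As a toy version of the setting (our own simplification, not the authors' two-stage scheme), consider a geometric competing-risks model with Type-I censoring at trial tau: at each trial, cause 1 fires with probability p1 and, if it does not, cause 2 fires with probability p2. The values of p1, p2, tau, and n are invented for the demonstration; the closed-form MLEs below follow from counting, for each cause, its failures and the Bernoulli trials in which it was at risk.

```python
import random

random.seed(7)

p1, p2, tau, n = 0.10, 0.05, 20, 5000
data = []
for _ in range(n):
    rec = (tau, 0)                    # default: censored at tau (cause code 0)
    for t in range(1, tau + 1):
        if random.random() < p1:      # cause 1 checked first at each trial
            rec = (t, 1)
            break
        if random.random() < p2:      # cause 2 only reached if cause 1 did not fire
            rec = (t, 2)
            break
    data.append(rec)

# MLEs: d_j failures of cause j over (d_j + e_j) trials at risk for cause j.
d1 = d2 = e1 = e2 = 0
for t, cause in data:
    if cause == 1:
        d1 += 1; e1 += t - 1; e2 += t - 1   # cause 2 never reached at trial t
    elif cause == 2:
        d2 += 1; e1 += t;     e2 += t - 1
    else:
        e1 += t; e2 += t                    # censored: survived all tau trials
p1_hat = d1 / (d1 + e1)
p2_hat = d2 / (d2 + e2)
```

The likelihood factorizes into two binomial pieces, one per cause, which is why each MLE is a simple failure-to-exposure ratio; grouping observations (as in the abstract's group-censored case) coarsens these counts but follows the same logic.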
8.
Maximum likelihood estimation of a spatial model typically requires sizeable computational capacity, even in relatively small samples, and becomes unfeasible in very large datasets. The unilateral approximation approach to spatial model estimation (suggested in Besag 1974) provides a viable alternative to maximum likelihood estimation that substantially reduces the computing time and storage required. In this article, we extend the method, originally proposed for conditionally specified processes, to simultaneous and to general bilateral spatial processes over rectangular lattices. We prove the estimators' consistency and study their finite-sample properties via Monte Carlo simulations.
9.
The present paper focuses on the sensitivity of technical inefficiency to the most commonly used one-sided distributions of the inefficiency error term, namely the truncated normal, the half-normal, and the exponential distributions. A generalized version of the half-normal, which does not embody the zero-mean restriction, is also explored. For each distribution, the likelihood function and the counterpart of the estimator of technical efficiency are explicitly stated (Jondrow, J., Lovell, C. A. K., Materov, I. S., Schmidt, P. (1982), On estimation of technical inefficiency in the stochastic frontier production function model, J. Econometrics 19:233–238). Based on our panel data set, covering Tunisian manufacturing firms over the period 1983–1993, formal tests lead to a strong rejection of the zero-mean restriction embodied in the half-normal distribution. Our main conclusion is that the degree of measured inefficiency is very sensitive to the postulated distribution of the one-sided error term. The estimated inefficiency indices are, however, unaffected by the choice of functional form for the production function.
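For the half-normal case the ingredients are classical and easy to write down: the composed error eps = v - u has the Aigner-Lovell-Schmidt density, and the observation-level inefficiency estimator is the Jondrow et al. (1982) conditional mean E[u | eps]. The sketch below (parameter values illustrative) implements both with the standard-normal pdf and cdf built from `math.erfc`.

```python
import math

def frontier_density(eps, sv, su):
    """Normal/half-normal composed-error density:
    f(eps) = (2/s) * phi(eps/s) * Phi(-eps*lam/s), s^2 = sv^2 + su^2, lam = su/sv."""
    s = math.hypot(sv, su)
    lam = su / sv
    phi = math.exp(-0.5 * (eps / s) ** 2) / (s * math.sqrt(2 * math.pi))
    Phi = 0.5 * math.erfc(eps * lam / (s * math.sqrt(2)))   # = Phi(-eps*lam/s)
    return 2 * phi * Phi

def jlms(eps, sv, su):
    """Jondrow et al. (1982) estimator E[u | eps] of technical inefficiency."""
    s = math.hypot(sv, su)
    sstar = sv * su / s                 # conditional s.d. of u given eps
    mu = -eps * su ** 2 / s ** 2        # conditional mean parameter
    z = mu / sstar
    phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    Phi = 0.5 * math.erfc(-z / math.sqrt(2))
    return sstar * (z + phi / Phi)

# sanity check: the composed-error density integrates to 1 (sv = su = 1)
step = 0.01
total = sum(frontier_density(-10 + step * i, 1.0, 1.0) * step for i in range(2001))
```

Larger eps (a more favorable residual) yields a smaller estimated inefficiency, which is the monotonicity the abstract's sensitivity comparison relies on; swapping in the truncated-normal or exponential one-sided term changes both functions.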
10.
In this paper we propose a new lifetime model for multivariate survival data in the presence of surviving fractions and examine some of its properties. Its genesis lies in situations where there are m types of unobservable competing causes, each related to the time of occurrence of an event of interest. Our model is a multivariate extension of the univariate survival cure rate model proposed by Rodrigues et al. [37]. Inference is based on maximum likelihood tools. We perform a simulation study to verify the asymptotic properties of the maximum likelihood estimators, focusing also on the size and power of the likelihood ratio test. The methodology is illustrated on a real customer churn data set.
11.
Mike G. Tsionas, Communications in Statistics - Theory and Methods, 2018, 47(12):3022-3028
The properties of high-dimensional Bingham distributions have been studied by Kume and Walker (2014). Fallaize and Kypraios (2016) propose Bayesian inference for the Bingham distribution, drawing on developments in Bayesian computation for distributions with doubly intractable normalizing constants (Møller et al. 2006; Murray, Ghahramani, and MacKay 2006). However, they rely heavily on two Metropolis updates that need to be tuned. In this article, we instead propose model selection via the marginal likelihood.
12.
Yong Bao, Econometric Reviews, 2018, 37(4):309-324
A compact analytical representation of the asymptotic covariance matrix of the quasi-maximum likelihood estimator (QMLE), expressed directly in terms of the model parameters, is derived for autoregressive moving average (ARMA) models with possibly nonzero means and non-Gaussian error terms. For the model parameters excluding the error variance, it is found that the Huber (1967) sandwich form of the asymptotic covariance matrix degenerates into the inverse of the associated information matrix. In comparison to the existing result, which involves the second moments of some auxiliary variables in the zero-mean ARMA case, the analytical asymptotic covariance in this article has the advantage that it can be conveniently estimated by plugging in the estimated model parameters directly.
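A quick numerical illustration of the flavor of this result, in the simplest special case: for a stationary AR(1), the least-squares/QMLE estimator of the AR coefficient has asymptotic variance (1 - rho^2)/n regardless of the error distribution's higher moments (this AR(1) fact is classical; the paper treats general ARMA models). The simulation design below, with uniform (non-Gaussian) errors, is our own.

```python
import math
import random

random.seed(11)

rho, n, reps = 0.5, 400, 300
ests = []
for _ in range(reps):
    y, ys = 0.0, []
    for _ in range(n + 50):                      # 50-observation burn-in
        e = random.uniform(-1, 1) * math.sqrt(3) # non-Gaussian error, variance 1
        y = rho * y + e
        ys.append(y)
    ys = ys[50:]
    # least-squares (conditional QMLE) estimate of rho
    num = sum(ys[t] * ys[t - 1] for t in range(1, n))
    den = sum(ys[t - 1] ** 2 for t in range(1, n))
    ests.append(num / den)

mean_est = sum(ests) / reps
var_est = sum((e - mean_est) ** 2 for e in ests) / reps
# n * var_est should be close to 1 - rho^2 = 0.75, as under Gaussian errors
```

The match between n·var_est and 1 - rho² despite uniform errors is the "sandwich collapses to the inverse information" phenomenon in miniature.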
13.
This paper applies a Bayesian model to a clinical trial study designed to determine a more effective treatment to lower mortality rates and thereby increase survival times among patients with lung cancer. In that study, Qian et al. [13] sought to determine whether a Weibull survival model can be used to decide when to stop a clinical trial. The traditional Gibbs sampler was used to estimate the model parameters. This paper proposes using the independent steady-state Gibbs sampling (ISSGS) approach, introduced by Dunbar et al. [3], to improve the original Gibbs sampler in multidimensional problems. It is demonstrated that ISSGS provides accurate, unbiased estimation and improves the performance and convergence of the Gibbs sampler in this application.
14.
Vinicius Fernando Calsavara, Agatha Sacramento Rodrigues, Vera Lúcia Damasceno Tomazella, Mário de Castro, Communications in Statistics - Theory and Methods, 2017, 46(19):9763-9776
In this article, we propose a flexible cure rate model, extending the model of Cancho et al. (2011) by incorporating a power variance function (PVF) frailty term in the latent risk. The model is more flexible in terms of dispersion and also quantifies the unobservable heterogeneity. Parameters are estimated by maximum likelihood, and Monte Carlo simulation studies are used to evaluate the proposed model's performance. The practical relevance of the model is illustrated on a real data set on cancer recurrence prevention.
15.
Heckman's (1976, 1979) sample selection model has been employed in many studies of linear and nonlinear regression applications. It is well known that ignoring sample selectivity may result in an inconsistent estimator, due to the correlation between the errors in the selection and main equations. In this article, we reconsider the maximum likelihood estimator for the panel sample selection model of Keane et al. (1988). Since the panel data model contains individual effects, such as fixed or random effects, the likelihood function is more complicated than that of the classical Heckman model. As an alternative to the existing derivation of the likelihood function in the literature, we show that the conditional distribution of the main equation follows a closed skew-normal (CSN) distribution, linear transformations of which remain CSN. Although evaluating the likelihood function involves high-dimensional integration, we show that the integration can be reduced to a one-dimensional problem and evaluated by the simulated likelihood method. A Monte Carlo experiment investigating the finite-sample performance of the proposed estimator shows that it provides reliable and quite satisfactory results.
16.
In this article, we consider two shared frailty regression models with a Gompertz baseline distribution. The gamma distribution is the most common choice of frailty distribution; to compare against the gamma frailty model, we also consider the inverse Gaussian shared frailty model. We fit both models to a real bivariate survival data set of acute leukemia remission times (Freireich et al., 1963). The analysis is performed using Markov chain Monte Carlo methods, the models are compared using a Bayesian model selection criterion, and a well-fitting model is suggested for the acute leukemia data.
17.
In this paper, we investigate the effect of pre-smoothing on model selection. Christóbal et al. [6] showed the beneficial effect of pre-smoothing on estimating the parameters of a linear regression model. Here, in a regression setting, we show that smoothing the response data prior to model selection by Akaike's information criterion can lead to an improved selection procedure. The bootstrap is used to control the magnitude of the random error structure in the smoothed data. The effect of pre-smoothing on model selection is shown in simulations. The method is illustrated in a variety of settings, including the selection of the best fractional polynomial in a generalized linear model.
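A toy sketch of the idea: pre-smooth the response, then select a model by AIC. We smooth with a simple moving average and choose a polynomial degree by least squares plus AIC; the smoother, its window, the candidate degrees, and the sinusoidal test function are our own illustrative choices, not the paper's procedure (which uses nonparametric smoothing together with a bootstrap correction).

```python
import math
import random

random.seed(1)

n = 200
xs = [i / (n - 1) for i in range(n)]
ys = [math.sin(2 * math.pi * x) + random.gauss(0, 0.4) for x in xs]

def moving_average(v, k):
    """Window of 2k+1 points, truncated at the boundaries."""
    return [sum(v[max(0, i - k):i + k + 1]) / len(v[max(0, i - k):i + k + 1])
            for i in range(len(v))]

def poly_rss(xs, ys, deg):
    """RSS of a degree-`deg` polynomial least-squares fit (normal equations)."""
    p = deg + 1
    X = [[x ** j for j in range(p)] for x in xs]
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * y for r, y in zip(X, ys)) for i in range(p)]
    for c in range(p):                       # Gaussian elimination, partial pivoting
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, p):
            m = A[r][c] / A[c][c]
            for j in range(c, p):
                A[r][j] -= m * A[c][j]
            b[r] -= m * b[c]
    beta = [0.0] * p
    for c in reversed(range(p)):
        beta[c] = (b[c] - sum(A[c][j] * beta[j] for j in range(c + 1, p))) / A[c][c]
    return sum((y - sum(bj * x ** j for j, bj in enumerate(beta))) ** 2
               for x, y in zip(xs, ys))

def aic_degree(xs, ys, max_deg=6):
    """Degree minimizing AIC = n*log(RSS/n) + 2*(#params + 1)."""
    m = len(ys)
    return min(range(max_deg + 1),
               key=lambda d: m * math.log(poly_rss(xs, ys, d) / m) + 2 * (d + 2))

deg_raw = aic_degree(xs, ys)
deg_smooth = aic_degree(xs, moving_average(ys, 5))
```

Smoothing shrinks the noise term that AIC's log-RSS reacts to, which is the mechanism by which pre-smoothing can sharpen the selection; in the paper the bootstrap then controls the error structure that the smoother induces.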
18.
Pao-Sheng Shen, Journal of Applied Statistics, 2011, 38(10):2345-2353
Doubly truncated data appear in a number of applications, including astronomy and survival analysis. For doubly truncated data, the lifetime T is observable only when U ≤ T ≤ V, where U and V are the left and right truncation times, respectively. Based on the empirical likelihood approach of Zhou [21], we propose a modified EM algorithm of Turnbull [19] to construct an interval estimator of the distribution function of T. Simulation results indicate that the empirical likelihood method can be more efficient than the bootstrap method.
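The point estimate underlying such methods can be sketched with the classical Efron-Petrosian-type self-consistency iteration for the NPMLE of F under double truncation; this is a generic EM-style scheme in the spirit of the Turnbull-type algorithm the abstract mentions, not the authors' exact implementation (which adds empirical-likelihood interval estimation on top).

```python
import random

random.seed(3)

def npmle_doubly_truncated(t, u, v, iters=50):
    """Self-consistency iteration for the NPMLE of P(T = s) when T is
    observed only if u_k <= T <= v_k (discrete support, for simplicity)."""
    pts = sorted(set(t))
    d = {s: t.count(s) for s in pts}            # multiplicities at each support point
    f = {s: 1 / len(pts) for s in pts}          # initial mass function
    n = len(t)
    for _ in range(iters):
        # P[k]: probability mass currently inside subject k's observation window
        P = [sum(f[s] for s in pts if u[k] <= s <= v[k]) for k in range(n)]
        new = {}
        for s in pts:
            w = sum(1 / P[k] for k in range(n) if u[k] <= s <= v[k])
            new[s] = d[s] / w                   # fixed-point update
        tot = sum(new.values())
        f = {s: new[s] / tot for s in pts}      # renormalize to a distribution
    return f

# With non-binding truncation limits the NPMLE reduces to the empirical pmf.
t = [random.randint(1, 5) for _ in range(50)]
u = [0] * 50
v = [10] * 50
f = npmle_doubly_truncated(t, u, v)
```

When the windows [u_k, v_k] actually bind, the update reweights observations by the inverse probability of being observable, which is exactly the correction double truncation requires.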
19.
James R. Schott, Communications in Statistics - Theory and Methods, 2017, 46(12):6112-6118
The allometric extension model is a multivariate regression model recently proposed by Tarpey and Ivey (2006). This model holds when the matrix of covariances between the variables in the response vector y and the variables in the vector of regressors x has a particular structure. In this paper, we consider tests of hypotheses for this structure when (y′, x′)′ has a multivariate normal distribution. In particular, we investigate the likelihood ratio test and a Wald test.