1.
In this paper, the asymptotic relative efficiency (ARE) of Wald tests for the Tweedie class of models with log-linear mean is considered when the auxiliary variable is measured with error. Wald test statistics based on the naive maximum likelihood estimator and on a consistent estimator obtained using Nakamura's (1990) corrected score function approach are defined. As shown analytically, the Wald statistics based on the naive and corrected score function estimators are asymptotically equivalent in terms of ARE. On the other hand, the asymptotic relative efficiency of the naive and corrected Wald statistics with respect to the Wald statistic based on the true covariate equals the square of the correlation between the unobserved and the observed covariate. A small-scale Monte Carlo study and an example illustrate the small-sample situation.
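The squared-correlation result in this abstract can be illustrated numerically. The sketch below (our simplification, not the paper's code: a single standard-normal covariate with additive normal error, and assumed parameter values) checks that corr(X, W)^2 matches the reliability ratio var(X) / (var(X) + var(U)), the quantity the abstract identifies as the ARE relative to the true-covariate Wald test.

```python
import math
import random

# Illustrative check: under additive normal measurement error W = X + U,
# the squared correlation between the true covariate X and the observed
# covariate W equals the reliability ratio var(X) / (var(X) + var(U)).
random.seed(42)
n = 200_000
sigma2_u = 0.5  # measurement-error variance (assumed value)

x = [random.gauss(0.0, 1.0) for _ in range(n)]
w = [xi + random.gauss(0.0, math.sqrt(sigma2_u)) for xi in x]

mean_x = sum(x) / n
mean_w = sum(w) / n
cov = sum((a - mean_x) * (b - mean_w) for a, b in zip(x, w)) / n
var_x = sum((a - mean_x) ** 2 for a in x) / n
var_w = sum((b - mean_w) ** 2 for b in w) / n

rho2 = cov ** 2 / (var_x * var_w)    # empirical corr(X, W)^2
are_theory = 1.0 / (1.0 + sigma2_u)  # reliability ratio for var(X) = 1

print(f"empirical rho^2 = {rho2:.3f}, theoretical ARE = {are_theory:.3f}")
```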
2.
One of the greatest challenges in using piecewise exponential models (PEMs) is finding an adequate grid of time-points for their construction. In general, the number of intervals in such a grid and the positions of their endpoints are ad hoc choices. We extend previous work by introducing a fully Bayesian approach for the piecewise exponential model in which the grid of time-points (and, consequently, the endpoints and the number of intervals) is random. We estimate the failure rates using the proposed procedure and compare the results with the non-parametric piecewise exponential estimates. Estimates of the survival function using the most probable partition are compared with the Kaplan-Meier estimators (KMEs). A sensitivity analysis for the proposed model is provided, considering different prior specifications for the failure rates and for the grid. We also evaluate the effect of different percentages of censored observations on the estimates. An application to a real data set is also provided. We notice that the posteriors are strongly influenced by the prior specifications, mainly for the failure rate parameters. Thus, the priors must be carefully built, genuinely reflecting the expert's prior opinion.
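For readers unfamiliar with the model family, a minimal sketch of the basic (fixed-grid) piecewise exponential survival function may help; the grid and rates below are hypothetical, and the notation is ours, not the authors'.

```python
import math

# Piecewise exponential model with a fixed grid 0 = a_0 < a_1 < ... < a_k
# and constant hazard lambda_j on each interval (a_{j-1}, a_j].
# The survival function is S(t) = exp(-H(t)), H the cumulative hazard.

def pem_cum_hazard(t, grid, rates):
    """Cumulative hazard H(t); `grid` holds the endpoints [a_1, ..., a_k]."""
    h, prev = 0.0, 0.0
    for a_j, lam in zip(grid, rates):
        if t <= a_j:
            return h + lam * (t - prev)
        h += lam * (a_j - prev)
        prev = a_j
    # beyond the last endpoint: extend the last rate
    return h + rates[-1] * (t - grid[-1])

def pem_survival(t, grid, rates):
    return math.exp(-pem_cum_hazard(t, grid, rates))

grid = [1.0, 3.0, 6.0]      # hypothetical time grid
rates = [0.10, 0.25, 0.05]  # hypothetical failure rates per interval
print(pem_survival(2.0, grid, rates))  # exp(-(0.10*1 + 0.25*1)) ≈ 0.705
```

The random-grid approach in the abstract treats `grid` (its length and its endpoints) as an additional unknown with its own prior, rather than fixing it in advance as done here.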
3.
Fabio N. Demarqui Rosangela H. Loschi Dipak K. Dey Enrico A. Colosimo 《Journal of statistical planning and inference》2012,142(3):728-742
A novel fully Bayesian approach for modeling survival data with explanatory variables using the Piecewise Exponential Model (PEM) with random time grid is proposed. We consider a class of correlated Gamma prior distributions for the failure rates. Such prior specification is obtained via the dynamic generalized modeling approach jointly with a random time grid for the PEM. A product distribution is considered for modeling the prior uncertainty about the random time grid, making it possible to use the structure of the Product Partition Model (PPM) to handle the problem. A unifying notation for the construction of the likelihood function of the PEM, suitable for both static and dynamic modeling approaches, is considered. Procedures to evaluate the performance of the proposed model are provided. Two case studies are presented in order to exemplify the methodology. For comparison purposes, the data sets are also fitted using the dynamic model with fixed time grid established in the literature. The results show the superiority of the proposed model.
4.
Liciana V. A. Silveira Enrico A. Colosimo José Raimundo de S. Passos 《Communications in Statistics - Theory and Methods》2013,42(15):2659-2666
It is common to have experiments in which it is not possible to observe the exact lifetimes but only the intervals in which they occur. This sort of data presents a high number of ties and is called grouped or interval-censored survival data. Regression methods for grouped data are available in the statistical literature. The regression structure models the probability of a subject surviving past a visit time conditional on survival at the previous visit. Two approaches are presented: assuming that lifetimes come from (1) a continuous proportional hazards model and (2) a logistic model. However, there may be situations in which neither model is adequate for a particular data set. This article proposes the generalized log-normal model as an alternative model for discrete survival data. This model was introduced by Chen (1995) and is extended in this article to grouped survival data. A real example related to Chagas disease illustrates the proposed model.
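The two conditional-probability structures mentioned in the abstract can be written compactly. The sketch below uses our own (assumed) notation: `p_j` is the baseline conditional survival of interval j, and `alpha_j` a baseline logit; the numbers are hypothetical.

```python
import math

# Grouped survival data: model P(T > a_j | T > a_{j-1}, x).
# (1) A continuous proportional hazards model implies
#       P(survive interval j | x) = p_j ** exp(beta' x)
#     (a complementary log-log model).
# (2) The logistic model puts a logit link on the same probability.

def cond_survival_ph(p_j, beta, x):
    """Conditional survival of interval j under the grouped PH model."""
    eta = sum(b * xi for b, xi in zip(beta, x))
    return p_j ** math.exp(eta)

def cond_survival_logistic(alpha_j, beta, x):
    """Same conditional probability under the logistic model."""
    eta = alpha_j + sum(b * xi for b, xi in zip(beta, x))
    return 1.0 / (1.0 + math.exp(-eta))

# Hypothetical values: baseline conditional survival 0.9, one covariate.
print(cond_survival_ph(0.9, [0.5], [1.0]))
print(cond_survival_logistic(2.0, [0.5], [1.0]))
```

The generalized log-normal proposal in the abstract replaces both link choices; we do not attempt to reproduce it here.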
5.
We consider the problem of adjusting a machine that manufactures parts in batches or lots and experiences random offsets or shifts whenever a set-up operation takes place between lots. The existing procedures for adjusting set-up errors in a production process over a set of lots are based on the assumption of known process parameters. In practice, these parameters are usually unknown, especially in short-run production. Due to this lack of knowledge, adjustment procedures such as Grubbs' (1954, 1983) rules and discrete integral controllers (also called EWMA controllers), aimed at adjusting for the initial offset in each single lot, are typically used. This paper presents an approach for adjusting the initial machine offset over a set of lots when the process parameters are unknown and are iteratively estimated using Markov Chain Monte Carlo (MCMC). As each observation becomes available, a Gibbs sampler is run to estimate the parameters of a hierarchical normal means model given the observations up to that point in time. The current lot mean estimate is then used for adjustment. If used over a series of lots, the proposed method eventually allows one to start adjusting the offset before the first part of each lot is produced. The method is illustrated with application to two examples reported in the literature. It is shown how the proposed MCMC adjusting procedure can outperform existing rules based on a quadratic off-target criterion.
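A toy version of the Gibbs-sampling step may clarify the mechanics. This is our illustration, not the authors' exact hierarchical model: a single lot with unknown offset `mu` and unknown noise variance `s2`, conjugate normal and inverse-gamma priors, and assumed prior hyperparameters.

```python
import random
import statistics

# Toy Gibbs sampler for one lot:
#   y_i | mu, s2 ~ N(mu, s2),  mu ~ N(m0, v0),  s2 ~ Inv-Gamma(a0, b0).
# Each sweep draws mu | s2, y and s2 | mu, y from their conjugate full
# conditionals; the posterior mean of mu is the offset used for adjustment.
random.seed(7)
true_offset = 2.0
y = [true_offset + random.gauss(0.0, 1.0) for _ in range(50)]
n, ybar = len(y), statistics.fmean(y)

m0, v0 = 0.0, 100.0  # vague prior on the offset (assumed)
a0, b0 = 2.0, 2.0    # prior on the noise variance (assumed)

mu, s2 = 0.0, 1.0
draws = []
for sweep in range(3000):
    # mu | s2, y ~ Normal (precision-weighted combination of prior and data)
    prec = 1.0 / v0 + n / s2
    mean = (m0 / v0 + n * ybar / s2) / prec
    mu = random.gauss(mean, (1.0 / prec) ** 0.5)
    # s2 | mu, y ~ Inverse-Gamma: draw Gamma(shape, rate) and invert
    sse = sum((yi - mu) ** 2 for yi in y)
    s2 = 1.0 / random.gammavariate(a0 + n / 2.0, 1.0 / (b0 + sse / 2.0))
    if sweep >= 500:  # discard burn-in
        draws.append(mu)

offset_hat = statistics.fmean(draws)
print(f"posterior mean offset = {offset_hat:.2f}")
```

The paper's setting is hierarchical across lots (so information from earlier lots sharpens the prior for the next lot's offset); the single-lot sampler above only shows the conjugate updates involved.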
6.
Almeida, Frederico Machado; Colosimo, Enrico Antônio; Mayrink, Vinícius Diniz 《Lifetime data analysis》2021,27(1):131-155
Lifetime Data Analysis - Models for situations where some individuals are long-term survivors, immune or non-susceptible to the event of interest, are extensively studied in biomedical research....
7.
In some survival studies, the exact time of the event of interest is unknown, but the event is known to have occurred during a particular period of time (interval-censored data). If the diagnostic tool used to detect the event of interest is not perfectly sensitive and specific, outcomes may be mismeasured: a healthy subject may be diagnosed as sick and a sick one may be diagnosed as healthy. In such cases, traditional survival analysis methods produce biased estimates of the time-to-failure distribution parameters (Paggiaro and Torelli 2004). In this context, we developed a parametric model that incorporates sensitivity and specificity into a grouped survival data analysis (a case of interval-censored data in which all subjects are tested at the same predetermined time points). Inferential aspects and properties of the methodology, such as the likelihood function and identifiability, are discussed in this article. Assuming known and nondifferential misclassification, Monte Carlo simulations showed that the proposed model performed well in the case of mismeasured outcomes; the relative bias of its estimates was lower than that of the naive method, which assumes perfect sensitivity and specificity. The proposed methodology is illustrated by a study related to mango tree lifetimes.
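The way sensitivity and specificity enter such a likelihood can be shown in one line. The sketch below uses assumed notation (`F` for the true time-to-failure distribution) and hypothetical numbers; it is the standard misclassification mixture, not necessarily the article's exact parametrization.

```python
# With sensitivity `sens` and specificity `spec`, the probability of a
# positive diagnosis at visit time a_j mixes the two true states:
#   P(positive at a_j) = sens * F(a_j) + (1 - spec) * (1 - F(a_j)),
# where F is the true time-to-failure distribution.  The naive analysis
# sets sens = spec = 1 and is biased whenever either is below 1.

def p_positive(F_aj, sens, spec):
    """Observed failure probability at a visit, given true F(a_j)."""
    return sens * F_aj + (1.0 - spec) * (1.0 - F_aj)

# Hypothetical numbers: true cumulative failure 0.30, imperfect test.
print(p_positive(0.30, sens=0.9, spec=0.95))  # 0.305
```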
8.
Enrico A. Colosimo Gustavo L. Gilardoni Wagner B. Santos Sergio B. Motta 《Communications in Statistics - Theory and Methods》2013,42(7):1289-1298
Determination of preventive maintenance is an important issue for systems under degradation. A typical maintenance policy calls for complete preventive repair actions at pre-scheduled times and minimal repair actions whenever a failure occurs. Under minimal repair, failures are modeled according to a nonhomogeneous Poisson process. A perfect preventive maintenance restores the system to the as-good-as-new condition. The motivation for this article was a maintenance data set related to power switch disconnectors. Two different types of failures could be observed for these systems according to their causes; the major difference between them is their costs. Assuming that the system will be in operation for an infinite time, we find the expected cost per unit time for each preventive maintenance policy and hence obtain the optimal strategy as a function of the process intensities. Assuming a parametric form for the intensity function, large-sample estimates of the optimal maintenance check points are obtained and discussed.
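For the single-failure-type, power-law special case, the cost-per-unit-time criterion and its minimizer have a well-known closed form. The sketch below states it under our assumed notation (costs `c_pm`, `c_mr` and PLP mean function Lambda(t) = (t / eta) ** beta are illustrative, not the article's two-type setting).

```python
# Long-run expected cost per unit time when preventive maintenance is done
# every tau units, minimal repairs cost c_mr each, and the PLP mean number
# of failures in (0, t] is Lambda(t) = (t / eta) ** beta:
#   C(tau) = (c_pm + c_mr * (tau / eta) ** beta) / tau.
# Setting C'(tau) = 0 gives, for beta > 1,
#   tau* = eta * (c_pm / ((beta - 1) * c_mr)) ** (1 / beta).

def cost_rate(tau, c_pm, c_mr, beta, eta):
    return (c_pm + c_mr * (tau / eta) ** beta) / tau

def optimal_tau(c_pm, c_mr, beta, eta):
    assert beta > 1.0, "an increasing intensity is needed for a finite optimum"
    return eta * (c_pm / ((beta - 1.0) * c_mr)) ** (1.0 / beta)

# Hypothetical costs and PLP parameters.
tau_star = optimal_tau(c_pm=10.0, c_mr=1.0, beta=2.0, eta=5.0)
print(tau_star)  # 5 * sqrt(10) ≈ 15.81
```

The article's two-failure-type policy replaces the single `c_mr` term with cause-specific costs and intensities, but the optimization has the same structure.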
9.
Asymptotic relative efficiency of score tests in Weibull models with measurement errors
The main object of this paper is to discuss properties of the score statistics for testing the null hypothesis of no association in the Weibull model with measurement errors. Three different score statistics are considered: the efficient score statistic, a naive score statistic obtained by replacing the unobserved true covariate with the observed one, and a score statistic based on the corrected score function. It is shown that the corrected and naive score statistics are equivalent, and the asymptotic relative efficiency of the naive statistic with respect to the efficient score statistic is derived.
10.
Maristela Dias de Oliveira Enrico A. Colosimo Gustavo L. Gilardoni 《Journal of statistical planning and inference》2012,142(5):1151-1160
Statistical models for recurrent events are of great interest in repairable systems reliability and maintenance. The adopted model under minimal repair maintenance is frequently a nonhomogeneous Poisson process with the power law process (PLP) intensity function. Although inference for the PLP is generally based on maximum likelihood theory, some advantages of the Bayesian approach have been reported in the literature. In this paper it is proposed that the PLP intensity be reparametrized in terms of (β,η), where β is the elasticity of the mean number of events with respect to time and η is the mean number of events for the period in which the system was actually observed. It is shown that β and η are orthogonal and that the likelihood becomes proportional to a product of gamma densities. Therefore, the family of natural conjugate priors is also a product of gammas. The idea is extended to the case that several realizations of the same PLP are observed along overlapping periods of time. Some Monte Carlo simulations are provided to study the frequentist behavior of the Bayesian estimates and to compare them with the maximum likelihood estimates. The results are applied to a real problem concerning the determination of the optimal periodicity of preventive maintenance for a set of power transformers. Prior distributions are elicited for β and η based on their operational interpretation and engineering expertise.
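Under the (β, η) parametrization described in the abstract, the maximum likelihood estimates for a single observed system take a simple closed form, which makes the operational interpretation concrete. The sketch below is our illustration (hypothetical failure times; standard PLP results, not code from the paper).

```python
import math

# Single PLP observed over (0, T] with failure times t_1 < ... < t_n and
# mean function Lambda(t) = eta * (t / T) ** beta, so eta is the expected
# number of events in (0, T] and beta the elasticity of Lambda w.r.t. time.
# Standard results give eta_hat = n and beta_hat = n / sum(log(T / t_i)).

def plp_mle(times, T):
    n = len(times)
    beta_hat = n / sum(math.log(T / t) for t in times)
    eta_hat = float(n)  # MLE of the mean number of events on (0, T]
    return beta_hat, eta_hat

# Hypothetical failure history of one repairable system.
times = [12.0, 35.0, 61.0, 78.0, 95.0]
beta_hat, eta_hat = plp_mle(times, T=100.0)
print(beta_hat, eta_hat)
```

A β estimate above 1 indicates a deteriorating system (increasing intensity), which is the regime in which scheduling preventive maintenance, as in the transformer application, makes sense.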