Similar Articles
20 similar articles found.
1.
This article mainly considers interval estimation of the scale and shape parameters of the generalized exponential (GE) distribution. We adopt the generalized fiducial method to construct new confidence intervals for the parameters of interest and compare them with those from frequentist and Bayesian methods. In addition, we compare point estimates based on the frequentist, generalized fiducial and Bayesian methods. Simulation results show that the new procedure based on generalized fiducial inference is more suitable than the non-fiducial methods for both point and interval estimation of the GE distribution. Finally, two lifetime data sets are used to illustrate the application of the new procedure.
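
For reference, in the standard Gupta-Kundu parameterization with shape α and scale λ (individual papers in this list may parameterize differently), the GE distribution function and density are

F(x;\alpha,\lambda)=\left(1-e^{-\lambda x}\right)^{\alpha},\qquad f(x;\alpha,\lambda)=\alpha\lambda\,e^{-\lambda x}\left(1-e^{-\lambda x}\right)^{\alpha-1},\qquad x>0,\ \alpha,\lambda>0.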

2.
The two-parameter generalized exponential (GE) distribution was introduced by Gupta and Kundu [Gupta, R.D. and Kundu, D., 1999, Generalized exponential distribution. Australian and New Zealand Journal of Statistics, 41(2), 173–188]. It was observed that the GE distribution can be used in situations where a skewed distribution for a nonnegative random variable is needed. In this article, Bayesian estimation and prediction for the GE distribution, using informative priors, are considered. Importance sampling is used to estimate the parameters and the reliability function, and the Gibbs and Metropolis samplers are used to predict the behavior of further observations from the distribution. Two data sets are used to illustrate the Bayesian procedure.
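
The following is a minimal sketch, not the authors' exact scheme, of importance sampling for the posterior means of the GE parameters and the reliability function; the placeholder data, the independent gamma priors, and the use of the prior as the importance density are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def ge_loglik(alpha, lam, x):
    # log-likelihood of a GE(alpha, lambda) sample x
    return np.sum(np.log(alpha) + np.log(lam) - lam * x
                  + (alpha - 1.0) * np.log1p(-np.exp(-lam * x)))

x = rng.gamma(2.0, 1.0, size=50)      # placeholder lifetime data (assumed)
M = 20000
# independent Gamma(2, 1) priors on alpha and lambda (illustrative choice);
# the prior doubles as the importance density, so the weights equal the likelihood
alpha_s = rng.gamma(2.0, 1.0, size=M)
lam_s = rng.gamma(2.0, 1.0, size=M)
logw = np.array([ge_loglik(a, l, x) for a, l in zip(alpha_s, lam_s)])
w = np.exp(logw - logw.max())
w /= w.sum()

print("posterior mean of alpha  ~", np.sum(w * alpha_s))
print("posterior mean of lambda ~", np.sum(w * lam_s))
t0 = 1.0  # reliability R(t0) = 1 - (1 - exp(-lambda t0))^alpha
print("posterior mean of R(t0)  ~", np.sum(w * (1.0 - (1.0 - np.exp(-lam_s * t0)) ** alpha_s)))

Using the prior as the importance density keeps the sketch short but can be inefficient; a proposal centred near the maximum likelihood estimates is a common refinement.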

3.
We develop a hierarchical Bayesian approach for inference in random coefficient dynamic panel data models. Our approach allows for the initial values of each unit's process to be correlated with the unit-specific coefficients. We impose a stationarity assumption for each unit's process by assuming that the unit-specific autoregressive coefficient is drawn from a logitnormal distribution. Our method is shown to have favorable properties compared to the mean group estimator in a Monte Carlo study. We apply our approach to analyze energy and protein intakes among individuals from the Philippines.

4.
In many parametric problems the use of order restrictions among the parameters can lead to improved precision. Our interest is in the study of several multinomial populations under the stochastic order restriction (SOR) for univariate situations. We use Bayesian methods to show that the SOR can lead to larger gains in precision than the method without the SOR when the SOR is reasonable. Unlike frequentist order-restricted inference, our methodology permits analysis even when there is uncertainty about the SOR. Our method is sampling-based, and we use simple and efficient rejection sampling. The Bayes factor in favor of the SOR is computed in a simple manner, and samples from the requisite posterior distributions are easily obtained. We use real data to illustrate the procedure, and we show that there are likely to be larger gains in precision under the SOR.
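
A generic sketch of the rejection-sampling idea for order-restricted posteriors (an assumed illustration, not necessarily the authors' exact algorithm): draw from the unconstrained Dirichlet posteriors of two multinomial populations and keep only the draws satisfying the stochastic order restriction; the acceptance rate then estimates the posterior probability of the SOR.

import numpy as np

rng = np.random.default_rng(2)

counts1 = np.array([12, 20, 18])   # illustrative multinomial cell counts, population 1
counts2 = np.array([6, 14, 30])    # population 2
M = 50000

# unconstrained posterior draws under (assumed) uniform Dirichlet(1,1,1) priors
p1 = rng.dirichlet(counts1 + 1, M)
p2 = rng.dirichlet(counts2 + 1, M)

# SOR: population 2 stochastically larger than population 1, i.e. its cumulative
# cell probabilities are no greater than those of population 1
F1 = np.cumsum(p1, axis=1)[:, :-1]
F2 = np.cumsum(p2, axis=1)[:, :-1]
keep = np.all(F2 <= F1, axis=1)

print("acceptance rate (posterior probability of the SOR):", keep.mean())
print("order-restricted posterior mean, population 1:", p1[keep].mean(axis=0))
print("order-restricted posterior mean, population 2:", p2[keep].mean(axis=0))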

5.
We propose a cure rate survival model by assuming that the number of competing causes of the event of interest follows the negative binomial distribution and that the time to the event of interest has the Birnbaum-Saunders distribution. The new model includes several recently published cure rate models as special cases. We consider a frequentist analysis for parameter estimation of the negative binomial Birnbaum-Saunders model with cure rate. We then derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes. We illustrate the usefulness of the proposed model in the analysis of a real data set from the medical field.

6.
Different strategies have been proposed to improve mixing and convergence properties of Markov Chain Monte Carlo algorithms. These are mainly concerned with customizing the proposal density in the Metropolis–Hastings algorithm to the specific target density and require a detailed exploratory analysis of the stationary distribution and/or some preliminary experiments to determine an efficient proposal. Various Metropolis–Hastings algorithms have been suggested that make use of previously sampled states in defining an adaptive proposal density. Here we propose a general class of adaptive Metropolis–Hastings algorithms based on Metropolis–Hastings-within-Gibbs sampling. For the case of a one-dimensional target distribution, we present two novel algorithms using mixtures of triangular and trapezoidal densities. These can also be seen as improved versions of the all-purpose adaptive rejection Metropolis sampling (ARMS) algorithm for sampling from non-log-concave univariate densities. Using various examples, we demonstrate their properties and efficiencies and point out their advantages over ARMS and other adaptive alternatives such as the Normal Kernel Coupler.
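
A minimal sketch of Metropolis-Hastings-within-Gibbs with plain random-walk proposals (the triangular/trapezoidal mixture proposals of the paper are not reproduced here, and the target density is an arbitrary illustrative choice):

import numpy as np

rng = np.random.default_rng(1)

def log_target(x, y):
    # an illustrative non-Gaussian bivariate target ("banana"-shaped density)
    return -0.5 * (x ** 2 / 4.0 + (y - 0.5 * x ** 2) ** 2)

n_iter, step = 10000, 1.0
x, y = 0.0, 0.0
draws = np.empty((n_iter, 2))
for i in range(n_iter):
    # Metropolis step for x with y held fixed (targets the full conditional of x)
    x_new = x + step * rng.standard_normal()
    if np.log(rng.uniform()) < log_target(x_new, y) - log_target(x, y):
        x = x_new
    # Metropolis step for y with x held fixed (targets the full conditional of y)
    y_new = y + step * rng.standard_normal()
    if np.log(rng.uniform()) < log_target(x, y_new) - log_target(x, y):
        y = y_new
    draws[i] = x, y

print("means after burn-in:", draws[2000:].mean(axis=0))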

7.
This paper presents an approach for constructing prediction intervals for any given distribution. The approach is based on the principle of fiducial inference. We use several examples, including the normal, binomial, exponential, gamma, and Weibull distributions, to illustrate the proposed procedure.
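
For the normal example mentioned above, the fiducial argument reproduces the familiar prediction interval for a future observation based on a sample of size n with mean \bar{X} and standard deviation S:

\bar{X}\ \pm\ t_{n-1,\,1-\alpha/2}\; S\,\sqrt{1+\tfrac{1}{n}}.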

8.
Shortest prediction intervals for a future observation from the Birnbaum-Saunders distribution are obtained from both frequentist and Bayesian perspectives. Comparisons are made with alternative intervals obtained via inversion. Monte Carlo simulations are performed to assess the approximate intervals.

9.
In this paper, we consider three different mixture models based on the Birnbaum-Saunders (BS) distribution, viz., (1) mixture of two different BS distributions, (2) mixture of a BS distribution and a length-biased version of another BS distribution, and (3) mixture of a BS distribution and its length-biased version. For all these models, we study their characteristics, including the shape of their density and hazard rate functions. For the maximum likelihood estimation of the model parameters, we use the EM algorithm. For the purpose of illustration, we analyze two data sets related to enzyme and depressive condition problems. In the case of the enzyme data, Model 1 is shown to provide the best fit, while for the depressive condition data all three models fit well, with Model 3 providing the best fit.
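
For orientation, the BS(α, β) density with shape α and scale β is usually written as

f_{\mathrm{BS}}(t;\alpha,\beta)=\frac{1}{2\sqrt{2\pi}\,\alpha\beta}\left[\left(\frac{\beta}{t}\right)^{1/2}+\left(\frac{\beta}{t}\right)^{3/2}\right]\exp\left\{-\frac{1}{2\alpha^{2}}\left(\frac{t}{\beta}+\frac{\beta}{t}-2\right)\right\},\qquad t>0,

so that Model 1, for instance, has the two-component form g(t)=p\,f_{\mathrm{BS}}(t;\alpha_{1},\beta_{1})+(1-p)\,f_{\mathrm{BS}}(t;\alpha_{2},\beta_{2}) with mixing weight 0\le p\le 1 (the parameterization is the standard one and is an assumption here, not taken from the paper).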

10.
This paper considers a class of densities formed by taking the product of nonnegative polynomials and normal densities. These densities provide a rich class of distributions that can be used in modelling when faced with non-normal characteristics such as skewness and multimodality. In this paper we address inferential and computational issues arising in the practical implementation of this parametric family in the context of the linear model. Exact results are recorded for the conditional analysis of location-scale models and an importance sampling algorithm is developed for the implementation of a conditional analysis for the general linear model when using polynomial-normal distributions for the error.

11.
This paper presents a method for Bayesian inference for the regression parameters in a linear model with independent and identically distributed errors that does not require the specification of a parametric family of densities for the error distribution. This method first selects a nonparametric kernel density estimate of the error distribution which is unimodal and based on the least-squares residuals. Once the error distribution is selected, the Metropolis algorithm is used to obtain the marginal posterior distribution of the regression parameters. The methodology is illustrated with data sets, and its performance relative to standard Bayesian techniques is evaluated using simulation results.

12.
The object of the paper is to provide recipes for various fiducial inferences on a parameter in nonparametric situations. First, the fiducial empirical distribution of a random variable is introduced and its almost sure behavior is established. Based on it, a fiducial model and hence the fiducial distribution of a parameter are obtained. Fiducial intervals for parameters that are functionals of the population are then constructed, and some of their frequentist properties are investigated under mild conditions. In addition, p-values for several test hypotheses and their asymptotic properties are given. Three applications of these results, together with further results, are provided. For the mean, simulations on its interval estimator and on hypothesis testing are conducted, and the results suggest that the fiducial method performs better than the other methods considered here.

13.
Shoukri and Consul (1989) and Scollnik (1995) have previously considered the Bayesian analysis of an overdispersed generalized Poisson model. Scollnik (1995) also considered the Bayesian analysis of a mixture model combining an ordinary Poisson and an overdispersed generalized Poisson component. In this paper, we discuss the Bayesian analysis of these models when they are utilised in a regression context. Markov chain Monte Carlo methods are employed, and an illustrative analysis is provided.
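
In Consul's usual parameterization (an assumption here; the papers may use different notation), the generalized Poisson probability mass function is

\Pr(X=x)=\frac{\theta\,(\theta+\lambda x)^{x-1}\,e^{-\theta-\lambda x}}{x!},\qquad x=0,1,2,\ldots,

with mean \theta/(1-\lambda) and variance \theta/(1-\lambda)^{3}, so that 0<\lambda<1 produces overdispersion relative to the ordinary Poisson case \lambda=0.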

14.
Very often, the likelihoods for circular data sets are of quite complicated forms, and the functional forms of the normalising constants, which depend upon the unknown parameters, are unknown. This latter problem generally precludes rigorous, exact inference (both classical and Bayesian) for circular data. Noting the paucity of literature on Bayesian circular data analysis, and also because realistic data analysis is naturally permitted by the Bayesian paradigm, we address the above problem from a Bayesian perspective. In particular, we propose a methodology that combines importance sampling and Markov chain Monte Carlo (MCMC) in a very effective manner to sample from the posterior distribution of the parameters, given the circular data. With a simulation study and a real data analysis, we demonstrate the considerable reliability and flexibility of our proposed methodology in analysing circular data.

15.
The Birnbaum-Saunders regression model is becoming increasingly popular in lifetime analyses and reliability studies. In this model, the signed likelihood ratio statistic provides the basis for hypothesis testing and the construction of confidence limits for a single parameter of interest. We focus on the small-sample case, where the standard normal distribution gives a poor approximation to the true distribution of the statistic. We derive three adjusted signed likelihood ratio statistics that lead to very accurate inference even for very small samples. Two empirical applications are presented.
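
For a scalar interest parameter \psi with nuisance parameters \lambda, the signed likelihood ratio statistic referred to above is

r(\psi)=\operatorname{sgn}(\hat{\psi}-\psi)\sqrt{2\,[\,\ell(\hat{\psi},\hat{\lambda})-\ell(\psi,\hat{\lambda}_{\psi})\,]},

where \ell is the log-likelihood and \hat{\lambda}_{\psi} is the constrained maximum likelihood estimate of \lambda for fixed \psi; r(\psi) is treated as approximately standard normal, and the adjustments studied in the paper modify r so that this approximation remains accurate in small samples.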

16.
In this paper, we consider the Bayesian inference of the unknown parameters of the randomly censored Weibull distribution. A joint conjugate prior on the model parameters does not exist; we assume that the parameters have independent gamma priors. Since closed-form expressions for the Bayes estimators cannot be obtained, we use Lindley's approximation, importance sampling and Gibbs sampling techniques to obtain the approximate Bayes estimates and the corresponding credible intervals. A simulation study is performed to observe the behaviour of the proposed estimators. A real data analysis is presented for illustrative purposes.

17.
The nonparametric Bayesian approach for inference regarding the unknown distribution of a random sample customarily assumes that this distribution is random and arises through Dirichlet-process mixing. Previous work within this setting has focused on the mean of the posterior distribution of this random distribution, which is the predictive distribution of a future observation given the sample. Our interest here is in learning about other features of this posterior distribution as well as about posteriors associated with functionals of the distribution of the data. We indicate how to do this in the case of linear functionals. An illustration, with a sample from a gamma distribution, utilizes Dirichlet-process mixtures of normals to recover this distribution and its features.

18.
This paper presents a kernel estimation of the distribution of the scale parameter of the inverse Gaussian distribution under type II censoring together with the distribution of the remaining time. Estimation is carried out via the Gibbs sampling algorithm combined with a missing data approach. Estimates and confidence intervals for the parameters of interest are also presented.

19.
We discuss a method for combining different but related longitudinal studies to improve predictive precision. The motivation is to borrow strength across clinical studies in which the same measurements are collected at different frequencies. Key features of the data are heterogeneous populations and an unbalanced design across three studies of interest. The first two studies are phase I studies with very detailed observations on a relatively small number of patients. The third study is a large phase III study with over 1500 enrolled patients, but with relatively few measurements on each patient. Patients receive different doses of several drugs in the studies, with the phase III study containing significantly less toxic treatments. Thus, the main challenges for the analysis are to accommodate heterogeneous population distributions and to formalize borrowing strength across the studies and across the various treatment levels. We describe a hierarchical extension over suitable semiparametric longitudinal data models to achieve the inferential goal. A nonparametric random-effects model accommodates the heterogeneity of the population of patients. A hierarchical extension allows borrowing strength across different studies and different levels of treatment by introducing dependence across these nonparametric random-effects distributions. Dependence is introduced by building an analysis of variance (ANOVA)-like structure over the random-effects distributions for different studies and treatment combinations. Model structure and parameter interpretation are similar to standard ANOVA models. Instead of the unknown normal means of standard ANOVA models, however, the basic objects of inference are random distributions, namely the unknown population distributions under each study. The analysis is based on a mixture of Dirichlet processes model as the underlying semiparametric model.

20.
In this paper, based on a progressively type-II censored sample from the generalized Rayleigh (GR) distribution, we consider the problem of estimating the model parameters and predicting the unobserved removed data. Maximum likelihood and Bayesian approaches are used to estimate the scale and shape parameters. The Gibbs and Metropolis samplers are used to predict the life lengths of the removed units in the multiple stages of the progressively censored sample. Artificial and real data analyses are performed for illustrative purposes.
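
In the usual two-parameter (Burr type X) form with shape α and scale λ (an assumption here; the paper may use a different notation), the GR distribution function and density are

F(x;\alpha,\lambda)=\left(1-e^{-(\lambda x)^{2}}\right)^{\alpha},\qquad f(x;\alpha,\lambda)=2\alpha\lambda^{2}x\,e^{-(\lambda x)^{2}}\left(1-e^{-(\lambda x)^{2}}\right)^{\alpha-1},\qquad x>0.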
