Similar Documents
1.
We discuss the analysis of random effects in capture-recapture models, and outline Bayesian and frequentist approaches to their analysis. Under a normal model, random effects estimators derived from Bayesian or frequentist considerations have a common form as shrinkage estimators. We discuss some of the difficulties of analysing random effects using traditional methods, and argue that a Bayesian formulation provides a rigorous framework for dealing with these difficulties. In capture-recapture models, random effects may provide a parsimonious compromise between constant and completely time-dependent models for the parameters (e.g. survival probability). We consider the application of random effects to band-recovery models, although the principles apply to more general situations, such as Cormack-Jolly-Seber models. We illustrate these ideas using a commonly analysed band-recovery data set.
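As a schematic illustration of that common shrinkage form (our notation, not the paper's): given parameter estimates $\hat{\theta}_i$ with sampling variances $v_i$, both empirical-Bayes and frequentist arguments lead to estimators of the type

\[ \tilde{\theta}_i = \hat{\mu} + \frac{\hat{\sigma}^2}{\hat{\sigma}^2 + v_i}\,\bigl(\hat{\theta}_i - \hat{\mu}\bigr), \]

where $\hat{\mu}$ estimates the mean of the random effects and $\hat{\sigma}^2$ the process variance; the less precise an individual estimate, the more strongly it is pulled toward the overall mean.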

2.
In this paper, we discuss the selection of random effects within the framework of generalized linear mixed models (GLMMs). Based on a reparametrization of the covariance matrix of the random effects in terms of the modified Cholesky decomposition, we propose adding a shrinkage penalty term to the penalized quasi-likelihood (PQL) function of the variance components for selecting effective random effects. The shrinkage penalty term is taken as a function of the variance of the random effects, motivated by the fact that if the variance is zero then the corresponding variable is no longer random (with probability one). The proposed method takes advantage of the convenient computation of the PQL estimation and the appealing properties of certain shrinkage penalty functions such as LASSO and SCAD. We propose a backfitting algorithm to estimate the fixed effects and variance components in GLMMs, which also selects effective random effects simultaneously. Simulation studies show that the proposed approach performs quite well in selecting effective random effects in GLMMs. A real data analysis using the proposed approach is also presented.
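Schematically (in generic notation, assuming the penalty acts on the random-effect standard deviations), the criterion being maximized has the form

\[ \mathrm{PQL}(\beta, \theta) \;-\; n \sum_{j=1}^{q} p_{\lambda}(\sigma_j), \]

where $\sigma_j$ is the standard deviation of the $j$th random effect implied by the modified Cholesky factors and $p_{\lambda}$ is a shrinkage penalty such as LASSO, $p_{\lambda}(t) = \lambda t$, or SCAD; estimating $\sigma_j = 0$ removes the $j$th random effect from the model.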

3.
A shrinkage estimation method for multinomial logit models is developed. The proposed method is based on shrinking the responses for each category towards the underlying probabilities. The estimator is also used in combination with Pregibon's resistant fitting, where it can control the over-estimation of Pregibon's resistant estimator. The proposed method not only handles the problem of separation in multinomial logit models; estimates also exist when the number of covariates is large relative to the sample size, and even when the MLE does not exist. Estimates can be easily computed with all commonly used statistical packages that support weighted fitting procedures. Estimates are compared with the usual MLE and with Firth's bias-reduction technique in a simulation study and an application.
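One illustrative way to realize such response shrinkage (an assumption on our part, not necessarily the authors' exact weighting): replace each observed category indicator $y_{ic} \in \{0, 1\}$ by $\tilde{y}_{ic} = (1 - \delta)\, y_{ic} + \delta\, \hat{\pi}_{ic}$ for a small $\delta \in (0, 1)$, where $\hat{\pi}_{ic}$ estimates the category probability, and refit by weighted maximum likelihood. Pseudo-responses of this kind stay strictly inside the parameter space, which is consistent with estimates existing under separation and with any weights-supporting package being able to compute them.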

4.
In this paper we address the problem of estimating a vector of regression parameters in the Weibull censored regression model. Our main objective is to provide natural adaptive estimators that significantly improve upon the classical procedures in situations where some of the predictors may or may not be associated with the response. In the context of two competing Weibull censored regression models (a full model and a candidate submodel), we consider an adaptive shrinkage estimation strategy that shrinks the full-model maximum likelihood estimate in the direction of the submodel maximum likelihood estimate. We develop the properties of these estimators using the notion of asymptotic distributional risk. The shrinkage estimators are shown to have higher efficiency than the classical estimators for a wide class of models. Further, we consider a LASSO-type estimation strategy and compare its relative performance with the shrinkage estimators. Monte Carlo simulations reveal that when the true model is close to the candidate submodel, the shrinkage strategy performs better than the LASSO strategy when, and only when, there are many inactive predictors in the model. The shrinkage and LASSO strategies are applied to a real data set from the Veterans' Administration (VA) lung cancer study to illustrate the usefulness of the procedures in practice.
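A common form of such adaptive (Stein-type) shrinkage estimators, sketched in generic notation (the paper's exact definition may differ): with full-model and submodel MLEs $\hat{\beta}_{FM}$ and $\hat{\beta}_{SM}$, a test statistic $T_n$ for the submodel restriction, and $k \ge 3$ restricted coefficients,

\[ \hat{\beta}^{S} = \hat{\beta}_{SM} + \Bigl(1 - \frac{k - 2}{T_n}\Bigr)\bigl(\hat{\beta}_{FM} - \hat{\beta}_{SM}\bigr), \]

with a positive-part variant that truncates the shrinkage factor at zero to avoid over-shrinkage when $T_n < k - 2$.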

5.
Empirical Bayes techniques for normal theory shrinkage estimation are extended to generalized linear models in a manner retaining the original spirit of shrinkage estimation, which is to reduce risk. The investigation identifies two classes of simple, all-purpose prior distributions, which supplement such non-informative priors as Jeffreys's prior with mechanisms for risk reduction. One new class of priors is motivated as optimizers of a core component of asymptotic risk. The methodology is evaluated in a numerical exploration and application to an existing data set.

6.
The capture-recapture method is applied to estimate the population size of a target population based on ascertainment data in epidemiological applications. We generalize the three-list case of Chao & Tsay (1998) to situations where more than three lists are available. An estimation procedure is presented using the concept of sample coverage, which can be interpreted as a measure of overlap information among multiple list records. When there is enough overlap, an estimator of the total population size is proposed. The bootstrap method is used to construct a variance estimator and confidence interval. If the overlap rate is relatively low, then the population size cannot be precisely estimated and thus only a lower (upper) bound is proposed for positively (negatively) dependent lists. The proposed method is applied to two data sets, one with a high and one with a low overlap rate.
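As a rough illustration of the sample-coverage idea (a generic one-step estimator, not Chao & Tsay's exact multi-list procedure, and ignoring adjustments for dependence between lists): coverage is estimated from the fraction of cases identified on exactly one list, and the population size follows as the number of distinct cases divided by estimated coverage. A minimal Python sketch with a bootstrap standard error:

import random

def coverage_estimate(histories):
    # histories: one tuple of 0/1 list-membership indicators per identified case
    D = len(histories)                      # number of distinct cases observed
    counts = [sum(h) for h in histories]    # how often each case was identified
    f1 = sum(1 for c in counts if c == 1)   # cases identified on exactly one list
    n = sum(counts)                         # total identifications across all lists
    C_hat = 1.0 - f1 / n                    # estimated sample coverage
    return D / C_hat                        # population-size estimate

def bootstrap_se(histories, B=1000, seed=1):
    # nonparametric bootstrap over the observed cases
    rng = random.Random(seed)
    D = len(histories)
    est = [coverage_estimate([histories[rng.randrange(D)] for _ in range(D)])
           for _ in range(B)]
    m = sum(est) / B
    return (sum((e - m) ** 2 for e in est) / (B - 1)) ** 0.5

# Example (made-up data): three lists; each tuple records presence on (list1, list2, list3).
data = ([(1, 0, 0)] * 30 + [(0, 1, 0)] * 25 + [(0, 0, 1)] * 20 +
        [(1, 1, 0)] * 10 + [(1, 0, 1)] * 8 + [(0, 1, 1)] * 6 + [(1, 1, 1)] * 4)
print(coverage_estimate(data), bootstrap_se(data))

For this made-up data set, 75 of the 135 identifications are singletons, so the coverage estimate is 1 - 75/135 and the point estimate is 103 / (60/135) = 231.75.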

7.
This paper discusses the problem of estimating a subset of parameters when the complementary subset is possibly redundant, in a linear regression model whose errors are generated from a long-memory process. Such a model arises from overmodelling a situation involving long-memory data. Along with the classical least-squares estimator and the restricted least-squares estimator, the preliminary-test least-squares estimator and the shrinkage least-squares estimator are investigated in an asymptotic set-up, and their relative performances are studied under contiguous alternatives. The contiguous alternatives under such dependence are fundamentally different from those under the independent-errors case.

8.
In this paper, we consider shrinkage and penalty estimation procedures in the linear regression model with autoregressive errors of order p when it is conjectured that some of the regression parameters are inactive. We develop the statistical properties of the shrinkage estimation method, including asymptotic distributional biases and risks. We show that the shrinkage estimators have a significantly higher relative efficiency than the classical estimator. Furthermore, we consider two penalty estimators, the least absolute shrinkage and selection operator (LASSO) and the adaptive LASSO, and numerically compare their relative performance with that of the shrinkage estimators. A Monte Carlo simulation experiment is conducted for different combinations of inactive predictors, and the performance of each estimator is evaluated in terms of the simulated mean-squared error. This study shows that the shrinkage estimators are comparable to the penalty estimators when the number of inactive predictors in the model is relatively large. The shrinkage and penalty methods are applied to a real data set to illustrate the usefulness of the procedures in practice.
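In generic notation, the model under study is

\[ y_t = x_t^{\top}\beta + u_t, \qquad u_t = \sum_{i=1}^{p} \phi_i\, u_{t-i} + \varepsilon_t, \]

with $\varepsilon_t$ independent zero-mean errors. The conjecture that some coefficients are inactive corresponds to the restriction that a subvector of $\beta$ is zero, and the shrinkage estimator moves the full-model estimate toward the restricted one, as in the Stein-type form sketched under item 4.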

9.
Biomarkers play a key role in the monitoring of disease progression. The time taken for an individual's biomarker to cross a meaningful threshold, from above or below, is often of interest. Due to the inherent variability of biomarkers, persistence criteria are sometimes included in the definitions of progression, such that only two consecutive measurements above or below the relevant threshold signal that “true” progression has occurred. In previous work, a novel approach was developed which allowed estimation of the time to threshold using the parameters from a linear mixed model where the residual variance was assumed to be pure measurement error. In this paper, we extend this methodology so that serial correlation can be accommodated. Assuming that the Markov property holds and applying the chain rule of probabilities, we found that the probability of progression at each timepoint can be expressed simply as the product of conditional probabilities. The methodology is applied to a cohort of HIV-positive individuals, where the time to reach a CD4 count threshold is estimated. The second application is based on a study of abdominal aortic aneurysms, where the time taken for an individual's aneurysm diameter to exceed 55 mm is studied. We observed that erroneously ignoring the residual correlation when it is strong may result in substantial overestimation of the time to threshold. The estimated probability of the biomarker reaching a threshold of interest, the expected time to threshold, and confidence intervals are presented for selected patients in both applications.
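Schematically (our notation, for a single threshold $c$ and without the persistence criterion): under a first-order Markov assumption, the probability that the biomarker $Y$ stays above the threshold through visit $k$ factorizes as

\[ P(Y_1 > c, \ldots, Y_k > c) = P(Y_1 > c) \prod_{j=2}^{k} P(Y_j > c \mid Y_{j-1} > c), \]

where each conditional probability is available from the bivariate normal distribution of $(Y_{j-1}, Y_j)$ implied by the linear mixed model with serially correlated residuals; the probability of first reaching the threshold at each timepoint then follows by differencing successive survival probabilities.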

10.
Random effects models are considered for count data obtained in a cross or nested classification. The main feature of the proposed models is the use of additive effects on the original scale, in contrast to the commonly used log scale; the rationale behind this approach is given. The estimation of variance components is based on the usual mean-square approach. Results directly analogous to those from analysis-of-variance models for continuous data are obtained. The usual Poisson dispersion test procedure can be used not only to test for no overall random effects but also to assess the adequacy of the model. Individual variance components can be tested using the usual F-test. Reliable estimation appears to require a large number of factor levels.

11.
We consider the Arnason-Schwarz model, usually used to estimate survival and movement probabilities from capture-recapture data. A missing data structure of this model is constructed which allows a clear separation of the information relative to capture and relative to movement. Extensions of the Arnason-Schwarz model are considered; for example, we consider a model that takes into account both the individual migration history and the individual reproduction history. Biological assumptions of these extensions are summarized via a directed graph. Owing to missing data, the posterior distribution of the parameters is numerically intractable. To overcome these computational difficulties we advocate a Gibbs sampling algorithm that takes advantage of the missing data structure inherent in capture-recapture models. Prior information on survival, capture and movement probabilities typically consists of a prior mean and a prior 95% credible interval. Dirichlet distributions are used to incorporate prior information on capture, survival, and movement probabilities. Finally, the influence of the prior on the Bayesian estimates of movement probabilities is examined.

12.
The contribution investigates the problem of estimating the size of a population, also known as the missing-cases problem. Suppose a registration system aims to identify all cases having a certain characteristic, such as a specific disease (cancer, heart disease, ...), a disease-related condition (HIV, heroin use, ...) or a specific behavior (driving a car without a license). Every case in such a registration system has a certain notification history, in that it might have been identified several times (at least once), which can be understood as a particular capture-recapture situation. Typically, cases that have never been listed at any occasion are left out, and it is this frequency one wants to estimate. In this paper, modelling concentrates on the counting distribution, i.e. the distribution of the variable that counts how often a given case has been identified by the registration system. Besides very simple models like the binomial or Poisson distribution, finite (nonparametric) mixtures of these are considered, providing rather flexible modelling tools. Estimation is done by maximum likelihood by means of the EM algorithm. A case study on heroin users in Bangkok in the year 2001 completes the contribution.
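A minimal sketch of the simplest such model (a plain Poisson counting distribution, without the finite-mixture extension or the EM algorithm): the zero-truncated Poisson MLE solves lambda / (1 - exp(-lambda)) = xbar, and the population size then follows from the estimated probability of being identified at least once. Python, with an illustrative made-up data set:

import math

def fit_ztp(counts, tol=1e-10):
    # MLE of a Poisson rate from zero-truncated counts (all counts >= 1),
    # via the fixed-point iteration lam <- xbar * (1 - exp(-lam)).
    xbar = sum(counts) / len(counts)
    lam = xbar
    while True:
        new = xbar * (1.0 - math.exp(-lam))
        if abs(new - lam) < tol:
            return new
        lam = new

def population_size(counts):
    # Horvitz-Thompson-type estimate: N-hat = n / P(count >= 1)
    lam = fit_ztp(counts)
    p_pos = 1.0 - math.exp(-lam)   # estimated probability of being identified at least once
    return len(counts) / p_pos

# Example (made-up): notification counts for identified cases; never-identified cases are unobserved.
counts = [1] * 120 + [2] * 40 + [3] * 12 + [4] * 3
print(population_size(counts))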

13.
We present a rank-based method for obtaining point and interval estimates of a scale version of the intraclass correlation coefficient in a one-way random effects model. When compared to the method of Arvesen and Schmitz (1970), the new method is not only applicable to a broader class of situations, but also much easier to implement. Results of a simulation study indicate that the new procedure compares favorably with the Arvesen-Schmitz procedure and the classical normal theory procedure, especially if the random components have heavy-tailed distributions.

14.
Patient flow modeling is a growing field of interest in health services research. Several techniques have been applied to model the movement of patients within and between health-care facilities. However, individual patient experience during the delivery of care has largely been overlooked. In this work, a random effects model is introduced to patient flow modeling and applied to data from a London hospital neonatal unit. In particular, a random effects multinomial logit model is used to capture individual patient trajectories in the process of care, with patient frailties modeled as random effects. Intuitively, both operational and clinical patient flow are modeled, the former being physical and the latter latent. Two variants of the model are proposed, one based on mere patient pathways and the other based on patient characteristics. Our technique can identify interesting pathways, such as those that result in a high probability of death (survival), pathways incurring the lowest (highest) cost of care, or pathways with the shortest (longest) length of stay. Patient-specific discharge probabilities from the health-care system can also be predicted. These are of interest to health-care managers in planning the scarce resources needed to run health-care institutions.
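In generic notation (the paper's parameterization may differ), a random effects multinomial logit for the transition of patient $i$ at step $j$ to state $k$ takes the form

\[ P(y_{ij} = k \mid u_i) = \frac{\exp(x_{ij}^{\top}\beta_k + u_{ik})}{\sum_{l} \exp(x_{ij}^{\top}\beta_l + u_{il})}, \]

with one reference category's linear predictor fixed at zero and the patient-specific frailties $u_i$ treated as random effects, so that repeated transitions by the same patient are correlated.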

15.
Nelder and Wedderburn (1972) gave a practical fitting procedure that encompassed a more general family of data distributions than the Gaussian distribution and provided an easily understood conceptual framework. In extending the framework to more than one error structure, the technical difficulties of the fitting procedure have tended to cloud the concepts. Here we show that a simple extension to the fitting procedure is possible, and thus pave the way for a fuller examination of mixed effects models with generalized linear model distributions. It is clear that we should not, and do not have to, confine ourselves to fitting random effects using the Gaussian distribution. In addition, in some quite general mixing distribution problems, the application of the EM algorithm to the complete-data likelihood leads to iterative schemes that maximize the marginal likelihood of the observed data.

16.
In this paper, we investigate the commonality of nonparametric component functions among different quantile levels in additive regression models. We propose two fused adaptive group Least Absolute Shrinkage and Selection Operator penalties to shrink the difference of functions between neighbouring quantile levels. The proposed methodology is able to simultaneously estimate the nonparametric functions and identify the quantile regions where functions are unvarying, and thus is expected to perform better than standard additive quantile regression when there exists a region of quantile levels on which the functions are unvarying. Under some regularity conditions, the proposed penalised estimators can theoretically achieve the optimal rate of convergence and identify the true varying/unvarying regions consistently. Simulation studies and a real data application show that the proposed methods yield good numerical results.
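Schematically (generic notation, not the paper's exact definition), for component functions $g_j^{(\tau_k)}$ at quantile levels $\tau_1 < \cdots < \tau_K$, a fused penalty of this kind has the form

\[ \lambda \sum_{k=2}^{K} \sum_{j=1}^{p} w_{kj}\, \bigl\| g_j^{(\tau_k)} - g_j^{(\tau_{k-1})} \bigr\|, \]

with adaptive weights $w_{kj}$; driving a difference exactly to zero fuses the function across neighbouring quantile levels, which is how the unvarying regions are identified.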

17.
Two-stage procedures are introduced to control the width and coverage (validity) of confidence intervals for the estimation of the mean, the between-groups variance component and certain ratios of the variance components in one-way random effects models. The procedures use the pilot sample data to estimate an “optimal” group size and then proceed to determine the number of groups by a stopping rule. Such sampling plans give rise to unbalanced data, which are consequently analyzed by the harmonic mean method. Several asymptotic results concerning the proposed procedures are given along with simulation results to assess their performance in moderate sample size situations. The proposed procedures were found to effectively control the width and probability of coverage of the resulting confidence intervals in all cases and were also found to be robust in the presence of missing observations. From a practical point of view, the procedures are illustrated using a real data set and it is shown that the resulting unbalanced designs tend to require smaller sample sizes than is needed in a corresponding balanced design where the group size is arbitrarily pre-specified.

18.
This paper develops a likelihood-based inference procedure for continuous-time capture-recapture models. The first-capture and recapture intensities are assumed to be in constant proportion but may otherwise vary arbitrarily through time. The full likelihood is partitioned into two factors, one of which is analogous to the likelihood in a special type of multiplicative intensity model arising in failure time analysis. The remaining factor is free of the non-parametric nuisance parameter and is easily maximized. This factor provides an estimator of population size and an asymptotic variance under a counting process framework. The resulting estimation procedure is shown to be equivalent to that derived from a martingale-based estimating function approach. Simulation results are presented to examine the performance of the proposed estimators.

19.
Statistical Methods & Applications - We consider a re-sampling scheme for estimation of the population parameters in the mixed-effects nonlinear regression models of the type used, for example,...
