Similar Documents
20 similar documents retrieved.
1.
The approach of Bayesian mixed effects modeling is an appropriate method for estimating both population-level and subject-specific times to steady state. Beyond pure estimation, the approach allows one to determine the time until a given fraction of individuals in a population has reached steady state with a pre-specified certainty. In this paper a mixed effects model for the parameters of a nonlinear pharmacokinetic model is used within a Bayesian framework. Model fitting by Markov chain Monte Carlo methods, implemented via the Gibbs sampler, is described, together with the extraction of the estimates and probability statements of interest. Finally, the proposed approach is illustrated by application to trough data from a multiple dose clinical trial.
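As a rough illustration of the quantity being estimated (a simulation sketch under an assumed one-compartment model with first-order elimination, not the authors' hierarchical model), the fraction of steady state reached after time t is 1 - exp(-ke*t), and between-subject variability in ke induces a population distribution of times to steady state:

```python
import numpy as np

# Illustrative sketch, not the paper's model: subject-specific time to reach
# 90% of steady state under first-order accumulation, f(t) = 1 - exp(-ke * t).
# Population parameters below are assumed values for illustration only.
rng = np.random.default_rng(1)
n_subjects = 1000
ke_pop, omega = 0.10, 0.30   # population elimination rate (1/h), between-subject SD (log scale)
ke = ke_pop * np.exp(rng.normal(0.0, omega, n_subjects))  # lognormal random effects

t90 = np.log(10.0) / ke      # solve 1 - exp(-ke * t) = 0.9 for t

# Time by which most of the population has reached 90% of steady state
print(f"population median t90: {np.median(t90):.1f} h")
print(f"80th percentile of t90: {np.quantile(t90, 0.8):.1f} h")
```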

2.
We present a scalable Bayesian modelling approach for identifying brain regions that respond to a certain stimulus and using them to classify subjects. More specifically, we deal with multi-subject electroencephalography (EEG) data with a binary response distinguishing between alcoholic and control groups. The covariates are matrix-variate, with measurements taken from each subject at different locations across multiple time points. EEG data have a complex structure with both spatial and temporal attributes. We use a divide-and-conquer strategy and build separate local models, that is, one model at each time point. We employ Bayesian variable selection approaches using a structured continuous spike-and-slab prior to identify the locations that respond to a certain stimulus. We incorporate the spatio-temporal structure through a Kronecker product of the spatial and temporal correlation matrices. We develop a highly scalable estimation algorithm, using likelihood approximation, to deal with the large number of parameters in the model. Variable selection is done via clustering of the locations based on their duration of activation. We use scoring rules to evaluate the prediction performance. Simulation studies demonstrate the efficiency of our scalable algorithm in terms of estimation and fast computation. We present results using our scalable approach on a case study of multi-subject EEG data.
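A minimal sketch of the separable spatio-temporal correlation mentioned above: the joint correlation over (location, time) pairs is the Kronecker product of a spatial and a temporal correlation matrix. The exponential kernels and 1-D electrode coordinates below are illustrative assumptions, not the paper's choices.

```python
import numpy as np

# Separable spatio-temporal correlation via a Kronecker product (illustrative
# kernels; the actual correlation structures in the paper may differ).
def exp_corr(coords, scale):
    """Exponential correlation from pairwise absolute distances."""
    d = np.abs(coords[:, None] - coords[None, :])
    return np.exp(-d / scale)

locations = np.linspace(0.0, 1.0, 4)   # 4 electrode positions (1-D stand-in)
times = np.arange(5.0)                 # 5 time points

S = exp_corr(locations, scale=0.5)     # 4 x 4 spatial correlation
T = exp_corr(times, scale=2.0)         # 5 x 5 temporal correlation
R = np.kron(S, T)                      # 20 x 20 joint correlation

# Entry for (loc i, time s) vs (loc j, time t) factorizes as S[i, j] * T[s, t]
assert np.isclose(R[0 * 5 + 1, 2 * 5 + 3], S[0, 2] * T[1, 3])
print(R.shape)
```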

3.
University courses in elementary statistics are usually taught from a frequentist perspective. In this paper I suggest how such courses can be taught using a Bayesian approach, and I indicate why beginning students are well served by a Bayesian course. A principal focus of any good elementary course is the application of statistics to real and important scientific problems. The Bayesian approach fits neatly with a scientific focus. Bayesians take a larger view, and one not limited to data analysis. In particular, the Bayesian approach is subjective, and requires assessing prior probabilities. This requirement forces users to relate current experimental evidence to other available information, including previous experiments of a related nature, where “related” is judged subjectively. I discuss difficulties faced by instructors and students in elementary Bayesian courses, and provide a sample syllabus for an elementary Bayesian course.
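For a flavor of the elementary material such a course covers, here is the standard beta-binomial prior-to-posterior update (a generic textbook example, not taken from the paper's sample syllabus):

```python
from scipy import stats

# Elementary Bayesian update: Beta prior on a success probability p,
# binomial data, Beta posterior (conjugacy). Numbers are illustrative.
a, b = 2, 2                 # prior Beta(2, 2): mild belief that p is near 0.5
successes, failures = 7, 3  # observed data

posterior = stats.beta(a + successes, b + failures)
print(f"posterior mean: {posterior.mean():.3f}")
lo, hi = posterior.ppf([0.025, 0.975])
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
```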

4.
The study of HIV dynamics is one of the most important developments in recent AIDS research. It has led to a new understanding of the pathogenesis of HIV infection. Although important findings in HIV dynamics have been published in prestigious scientific journals, the statistical methods for parameter estimation and model fitting used in those papers appear surprisingly crude and have not been studied in detail. For example, unidentifiable parameters were simply imputed by mean estimates from previous studies, and important pharmacological/clinical factors were not considered in the modelling. In this paper, a viral dynamic model is developed to evaluate the effects of pharmacokinetic variation, drug resistance and adherence on antiviral responses. In the context of this model, we investigate a Bayesian modelling approach under a non-linear mixed-effects (NLME) model framework. In particular, our modelling strategy allows us to estimate the time-varying antiviral efficacy of a regimen during the whole course of a treatment period by incorporating information on drug exposure and drug susceptibility. Both simulated and real clinical data examples are given to illustrate the proposed approach. The Bayesian approach has great potential in many aspects of viral dynamics modelling, since it allows us to fit complex dynamic models and identify all the model parameters. Our results suggest that the Bayesian approach to estimating parameters in HIV dynamic models is flexible and powerful.
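For orientation, a generic target-cell-limited viral dynamic model with time-varying drug efficacy can be simulated as below. This is a textbook-style stand-in with assumed parameter values and an assumed efficacy profile, not the model developed in the paper:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic target-cell-limited viral dynamic model with time-varying drug
# efficacy gamma(t) (an illustrative stand-in, not the paper's model).
lam, d, k, delta, N, c = 1e4, 0.01, 8e-7, 0.7, 100.0, 3.0  # assumed values

def efficacy(t):
    # Assumed efficacy profile: decays as adherence/resistance erode potency.
    return 0.9 * np.exp(-0.01 * t)

def rhs(t, y):
    T, Ti, V = y  # target cells, infected cells, free virus
    infection = (1.0 - efficacy(t)) * k * T * V
    return [lam - d * T - infection,
            infection - delta * Ti,
            N * delta * Ti - c * V]

sol = solve_ivp(rhs, (0.0, 100.0), [1e6, 1e4, 1e5],
                t_eval=np.linspace(0.0, 100.0, 201))
print(f"log10 viral load at day 100: {np.log10(sol.y[2, -1]):.2f}")
```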

5.
In the Bayesian approach, the Behrens–Fisher problem has been posed as one of estimating the difference of two means. No Bayesian solution to the Behrens–Fisher testing problem has yet been given, perhaps because the conventional priors used are improper. While default Bayesian analysis can be carried out for estimation purposes, it poses difficulties for testing problems. This paper generates sensible intrinsic and fractional prior distributions for the Behrens–Fisher testing problem from the improper priors commonly used for estimation. This allows us to compute the Bayes factor comparing the null and the alternative hypotheses. This default model selection procedure is compared with a frequentist test and the Bayesian information criterion. We find a discrepancy: the frequentist test and the Bayesian information criterion reject the null hypothesis for data for which the Bayes factors based on intrinsic or fractional priors do not.
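To make the frequentist and BIC sides of the comparison concrete, the sketch below runs Welch's t-test and a BIC comparison of the common-mean versus separate-means models (the Schwarz approximation to a Bayes factor). The paper's intrinsic and fractional prior Bayes factors are a different, more careful construction and are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 15)   # group 1: assumed toy data
y = rng.normal(0.6, 3.0, 20)   # group 2: different mean and variance

# Frequentist side: Welch's t-test (unequal variances).
t, p = stats.ttest_ind(x, y, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3f}")

def gauss_loglik(z, mu):
    s2 = np.mean((z - mu) ** 2)            # MLE of variance given mu
    return -0.5 * len(z) * (np.log(2 * np.pi * s2) + 1.0)

# H1: separate means, so plug in the sample means.
ll1 = gauss_loglik(x, x.mean()) + gauss_loglik(y, y.mean())

# H0: common mean, unequal variances; no closed form, so iterate the
# variance-weighted mean to a fixed point.
mu = np.concatenate([x, y]).mean()
for _ in range(50):
    w = np.array([len(x) / np.mean((x - mu) ** 2),
                  len(y) / np.mean((y - mu) ** 2)])
    mu = (w[0] * x.mean() + w[1] * y.mean()) / w.sum()
ll0 = gauss_loglik(x, mu) + gauss_loglik(y, mu)

n = len(x) + len(y)
bic0 = 3 * np.log(n) - 2 * ll0   # params: mu, sigma1^2, sigma2^2
bic1 = 4 * np.log(n) - 2 * ll1   # params: mu1, mu2, sigma1^2, sigma2^2
print(f"BIC-approximate Bayes factor BF01 ~ {np.exp((bic1 - bic0) / 2):.2f}")
```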

6.
We generalize Wedderburn's (1974) notion of quasi-likelihood to define a quasi-Bayesian approach for nonlinear estimation problems, allowing the full distributional assumptions about the random component in the classical Bayesian approach to be replaced by much weaker assumptions in which only the first and second moments of the prior distribution are specified. The formulas given are based on the Gauss–Newton estimating procedure and require only the first and second moments of the distributions involved. The use of the GLIM package to solve the estimation problems considered is discussed. Applications are made to estimation problems in inverse linear regression, to regression models with both variables subject to error, and to the estimation of the size of animal populations. Some numerical illustrations are reported. For the inverse linear regression problem, comparisons with ordinary Bayesian and other techniques are considered.
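A sketch of the general idea, under illustrative assumptions: a Gauss–Newton step in which the prior enters only through its first two moments (mean m0, covariance S0), acting as a quadratic penalty. This conveys the moment-based flavor of the approach, not the paper's exact formulas.

```python
import numpy as np

# Sketch of a Gauss-Newton step that uses only first/second prior moments
# (prior mean m0, prior covariance S0) as a quadratic penalty. Illustration
# of the moment-based idea only, not the paper's derivation.
def gn_step(beta, x, y, f, jac, m0, S0_inv, noise_var):
    r = y - f(x, beta)                    # residuals
    J = jac(x, beta)                      # n x p Jacobian of f in beta
    A = J.T @ J / noise_var + S0_inv      # curvature: data plus prior moments
    g = J.T @ r / noise_var - S0_inv @ (beta - m0)
    return beta + np.linalg.solve(A, g)

# Toy nonlinear model y = exp(b * x) + noise, with assumed prior moments.
f = lambda x, b: np.exp(b[0] * x)
jac = lambda x, b: (x * np.exp(b[0] * x))[:, None]

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 40)
y = np.exp(0.8 * x) + rng.normal(0.0, 0.1, x.size)

beta = np.array([0.1])                            # starting value
m0, S0_inv = np.array([0.5]), np.array([[4.0]])   # assumed prior moments
for _ in range(30):
    beta = gn_step(beta, x, y, f, jac, m0, S0_inv, noise_var=0.01)
print(f"estimate: {beta[0]:.3f}")   # near 0.8, shrunk slightly toward m0
```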

7.
Extropy, a complementary dual of entropy, is considered in this paper. A Bayesian approach based on the Dirichlet process is proposed for the estimation of extropy. A goodness-of-fit test is also developed. Many theoretical properties of the procedure are derived. Several examples are discussed to illustrate the approach.
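For readers unfamiliar with extropy: for a discrete distribution p it is J(p) = -sum_i (1 - p_i) log(1 - p_i), the complementary dual of Shannon entropy H(p) = -sum_i p_i log p_i. A minimal computation of the definition (the paper's Dirichlet-process estimator is not shown):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p, where=p > 0, out=np.zeros_like(p)))

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i), the dual of entropy."""
    q = 1.0 - np.asarray(p, dtype=float)
    return -np.sum(q * np.log(q, where=q > 0, out=np.zeros_like(q)))

p = np.array([0.5, 0.25, 0.25])
print(f"H = {entropy(p):.4f}, J = {extropy(p):.4f}")
# For two outcomes entropy and extropy coincide: H(p) == J(p) when len(p) == 2.
print(np.isclose(entropy([0.3, 0.7]), extropy([0.3, 0.7])))
```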

8.
In this article, we propose a Bayesian approach to estimating multiple structural change-points in the level and the trend when the number of change-points is unknown. Our formulation of the structural-change model involves a binary discrete variable that indicates a structural change. The determination of the number and form of the structural changes is treated as a model selection issue in Bayesian structural-change analysis. We apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to this model selection problem. SAMC is effective for estimating this complex structural-change model because it avoids entrapment in local posterior modes. The model parameters in each regime are estimated using the Gibbs sampler after each change-point is detected. The performance of our proposed method has been investigated on simulated and real data sets: a long time series of US real gross domestic product, US uses of force between 1870 and 1994, and a one-year time series of temperature in Seoul, South Korea.

9.
The variational approach to Bayesian inference enables simultaneous estimation of model parameters and model complexity. An interesting feature of this approach is that it also leads to an automatic choice of model complexity. Empirical results from the analysis of hidden Markov models with Gaussian observation densities illustrate this. If the variational algorithm is initialized with a large number of hidden states, redundant states are eliminated as the method converges to a solution, thereby leading to a selection of the number of hidden states. In addition, through the use of a variational approximation, the deviance information criterion for Bayesian model selection can be extended to the hidden Markov model framework. Calculation of the deviance information criterion provides a further tool for model selection, which can be used in conjunction with the variational approach.

10.
Pharmacokinetic studies are commonly performed using the two-stage approach. The first stage involves estimating pharmacokinetic parameters, such as the area under the concentration versus time curve (AUC), for each subject separately, and the second stage uses the individual parameter estimates for statistical inference. This two-stage approach is not applicable in sparse sampling situations where only one sample is available per subject, as in non-clinical in vivo studies. In a serial sampling design, only one sample is taken from each subject. A simulation study was carried out to assess the coverage, power and type I error of seven methods for constructing two-sided 90% confidence intervals for the ratio of two AUCs assessed in a serial sampling design, which can be used to assess bioequivalence in this parameter.
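One classical estimator in this setting, given from general knowledge and not necessarily among the seven methods compared, is Bailer's approach: apply trapezoidal weights to the per-time-point mean concentrations and build a normal-approximation confidence interval from the weighted variance of those means. A sketch with illustrative data:

```python
import numpy as np
from scipy import stats

# Serial sampling: each animal yields one concentration at one time point.
# Bailer-type AUC estimate (trapezoidal weights on group means) with a
# normal-approximation CI. Data and method choice are illustrative.
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0])         # sampling times (h)
conc = [np.array([2.1, 1.9, 2.4]),              # one group per time point
        np.array([3.0, 2.7, 3.3]),
        np.array([2.2, 2.6, 2.0]),
        np.array([1.1, 0.9, 1.3]),
        np.array([0.4, 0.3, 0.5])]

# Trapezoidal weights: w0 = (t1-t0)/2, wj = (t(j+1)-t(j-1))/2, wm = (tm-t(m-1))/2
w = np.empty_like(t)
w[0] = (t[1] - t[0]) / 2
w[-1] = (t[-1] - t[-2]) / 2
w[1:-1] = (t[2:] - t[:-2]) / 2

means = np.array([c.mean() for c in conc])
var_means = np.array([c.var(ddof=1) / len(c) for c in conc])  # var of each mean

auc = np.sum(w * means)
se = np.sqrt(np.sum(w ** 2 * var_means))
z = stats.norm.ppf(0.95)                        # two-sided 90% CI
print(f"AUC = {auc:.2f}, 90% CI = ({auc - z * se:.2f}, {auc + z * se:.2f})")
```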

11.
A new variational Bayesian (VB) algorithm, split and eliminate VB (SEVB), for modeling data via a Gaussian mixture model (GMM) is developed. This new algorithm makes use of component splitting in a way that is more appropriate than existing VB-based approaches for analyzing a large number of highly heterogeneous spiky spatial patterns with weak prior information. SEVB is a highly computationally efficient approach to Bayesian inference and, like any VB-based algorithm, it can perform model selection and parameter estimation simultaneously. A significant feature of our algorithm is that the fitted number of components is not limited by the initial proposal, giving increased modeling flexibility. We introduce two types of split operation in addition to proposing a new goodness-of-fit measure for evaluating mixture models, and we evaluate their usefulness through empirical studies. In addition, we illustrate the utility of our new approach in an application on modeling human mobility patterns. This application involves large volumes of highly heterogeneous spiky data, which are difficult to model well using the standard VB approach, as it is too restrictive and lacks the required flexibility. Empirical results suggest that our algorithm improves upon the goodness of fit that would have been achieved using the standard VB method, and that it is also more robust to various initialization settings.
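An off-the-shelf analogue of the component-elimination behavior (scikit-learn's variational GMM, not the authors' SEVB algorithm): with a Dirichlet process weight prior, components the data do not support are driven toward zero weight, so initializing with too many components is harmless.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Off-the-shelf variational GMM (not SEVB): with a Dirichlet process weight
# prior, redundant components are driven toward zero weight, so the effective
# number of components is chosen during fitting.
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0.0, 0.5, (200, 2)),
               rng.normal(4.0, 0.5, (200, 2)),
               rng.normal([0.0, 6.0], 0.5, (200, 2))])   # 3 true clusters

vb = BayesianGaussianMixture(
    n_components=10,                                     # deliberately too many
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500, random_state=0).fit(X)

effective = np.sum(vb.weights_ > 0.01)
print(f"components with non-negligible weight: {effective}")  # typically 3
```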

12.
This paper describes inference methods for functional data under the assumption that the functional data of interest are smooth latent functions, characterized by a Gaussian process, which have been observed with noise over a finite set of time points. The methods we propose are completely specified in a Bayesian environment that allows for all inferences to be performed through a simple Gibbs sampler. Our main focus is on estimating and describing uncertainty in the covariance function. However, these models also encompass functional data estimation, functional regression where the predictors are latent functions, and an automatic approach to smoothing parameter selection. Furthermore, these models require minimal assumptions on the data structure, as the time points for observations do not need to be equally spaced, the number and placement of observations are allowed to vary among functions, and special treatment is not required when the number of functional observations is less than the dimensionality of those observations. We illustrate the effectiveness of these models in estimating latent functional data, capturing variation in the functional covariance estimate, and selecting appropriate smoothing parameters in both a simulation study and a regression analysis of medfly fertility data.
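The generative view assumed by such models can be sketched directly: smooth latent functions drawn from a Gaussian process, observed with noise at irregular, per-function time points. The squared-exponential kernel below is an illustrative choice, not necessarily the paper's.

```python
import numpy as np

# Generative sketch: smooth latent functions drawn from a Gaussian process,
# observed with noise at irregular, per-function time points.
def se_kernel(s, t, variance=1.0, length=0.3):
    return variance * np.exp(-0.5 * (s[:, None] - t[None, :]) ** 2 / length ** 2)

rng = np.random.default_rng(11)
curves = []
for _ in range(5):                     # 5 functional observations
    m = rng.integers(8, 15)            # number/placement of points may vary
    tt = np.sort(rng.uniform(0.0, 1.0, m))
    K = se_kernel(tt, tt) + 1e-8 * np.eye(m)       # jitter for stability
    f = rng.multivariate_normal(np.zeros(m), K)    # latent smooth function
    y = f + rng.normal(0.0, 0.1, m)                # noisy observations
    curves.append((tt, y))

print([len(tt) for tt, _ in curves])
```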

13.
This paper studies a functional coefficient time series model with trending regressors, where the coefficients are unknown functions of time and random variables. We propose a local linear estimation method to estimate the unknown coefficient functions, and establish the corresponding asymptotic theory under mild conditions. We also develop a test procedure to see if the functional coefficients take particular parametric forms. For practical use, we further propose a Bayesian approach to select the bandwidths, and conduct several numerical experiments to examine the finite sample performance of our proposed local linear estimator and the test procedure. The results show that the local linear estimator works well and the proposed test has satisfactory size and power. In addition, our simulation studies show that the Bayesian bandwidth selection method performs better than the cross-validation method. Furthermore, we use the functional coefficient model to study the relationship between consumption per capita and income per capita in the United States, and the results show that the functional coefficient model with our proposed local linear estimator and Bayesian bandwidth selection method performs well in both in-sample fitting and out-of-sample forecasting.
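A stripped-down sketch of local linear estimation of a coefficient function (one regressor, hand-picked bandwidth rather than the paper's Bayesian bandwidth selection):

```python
import numpy as np

# Local linear estimation of a functional coefficient model
#   y_i = beta(u_i) * x_i + eps_i
# (a one-regressor sketch of the general technique; the bandwidth is fixed
# by hand here instead of selected by the Bayesian method).
def local_linear_beta(u0, u, x, y, h):
    k = np.exp(-0.5 * ((u - u0) / h) ** 2)     # Gaussian kernel weights
    D = np.column_stack([x, x * (u - u0)])     # local linear design
    W = np.sqrt(k)
    coef, *_ = np.linalg.lstsq(D * W[:, None], y * W, rcond=None)
    return coef[0]                             # local intercept = beta(u0)

rng = np.random.default_rng(5)
n = 500
u = rng.uniform(0.0, 1.0, n)
x = rng.normal(1.0, 1.0, n)
beta_true = np.sin(2 * np.pi * u)
y = beta_true * x + rng.normal(0.0, 0.2, n)

grid = np.linspace(0.05, 0.95, 5)
est = [local_linear_beta(g, u, x, y, h=0.08) for g in grid]
print(np.round(est, 2))
print(np.round(np.sin(2 * np.pi * grid), 2))   # true values for comparison
```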

14.
For nearly any challenging scientific problem, evaluation of the likelihood is problematic if not impossible. Approximate Bayesian computation (ABC) allows us to employ the whole Bayesian formalism for problems where we can simulate from a model but cannot evaluate the likelihood directly. When summary statistics of real and simulated data are compared, rather than the data directly, information is lost unless the summary statistics are sufficient. Sufficient statistics are rarely available, however, and without them inferences from ABC must be treated with caution. Other authors have previously attempted to combine different statistics in order to construct (approximately) sufficient statistics using search and information heuristics. Here we employ an information-theoretical framework that can be used to construct appropriate (approximately sufficient) statistics by combining different statistics until the loss of information is minimized. We start from a potentially large number of different statistics and choose the smallest set that captures (nearly) the same information as the complete set. We then demonstrate that such sets of statistics can be constructed for both parameter estimation and model selection problems, and we apply our approach to a range of illustrative and real-world model selection problems.
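For reference, the basic ABC rejection scheme that the statistic-selection procedure builds on looks as follows (a minimal sketch with an assumed toy model; the information-theoretical selection of statistics is not shown):

```python
import numpy as np

# Minimal ABC rejection sampler: accept parameter draws whose simulated
# summary statistics land close to the observed ones.
rng = np.random.default_rng(42)

def simulate(theta, n=50):
    return rng.normal(theta, 1.0, n)

def summaries(data):
    return np.array([data.mean(), data.std(ddof=1)])   # candidate statistics

obs = simulate(2.0)                                    # pretend this is real data
s_obs = summaries(obs)

accepted = []
for _ in range(20000):
    theta = rng.uniform(-5.0, 5.0)                     # draw from the prior
    s_sim = summaries(simulate(theta))
    if np.linalg.norm(s_sim - s_obs) < 0.25:           # tolerance epsilon
        accepted.append(theta)

post = np.array(accepted)
print(f"{post.size} accepted; posterior mean ~ {post.mean():.2f}")
```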

15.
Traditionally, time series analysis involves building an appropriate model and using either parametric or nonparametric methods to make inference about the model parameters. Motivated by recent developments in dimension reduction for time series, this article presents an empirical application of sufficient dimension reduction (SDR) to nonlinear time series modelling. Here, we use the time series central subspace as a tool for SDR and estimate it using a mutual information index. In particular, to reduce the computational complexity in time series, we propose an efficient method for estimating the minimal dimension and lag using a modified Schwarz–Bayesian criterion when either the dimension or the lag is unknown. Through simulations and real data analysis, the approach presented in this article is shown to perform well in autoregression and volatility estimation.

16.
Investigators who manage multicenter clinical trials need to pay careful attention to patterns of subject accrual, and the prediction of activation time for pending centers is potentially crucial for subject accrual prediction. We propose a Bayesian hierarchical model to predict subject accrual for multicenter clinical trials in which center activation times vary. We define center activation time as the time at which a center can begin enrolling patients in the trial. The difference in activation times between centers is assumed to follow an exponential distribution, and the model of subject accrual integrates prior information for the study with actual enrollment progress. We apply our proposed Bayesian multicenter accrual model to two multicenter clinical studies. The first is the PAIN-CONTRoLS study, a multicenter clinical trial with a goal of activating 40 centers and enrolling 400 patients within 104 weeks. The second is the HOBIT trial, a multicenter clinical trial with a goal of activating 14 centers and enrolling 200 subjects within 36 months. In summary, the Bayesian multicenter accrual model provides a prediction of subject accrual while accounting for both center- and individual patient-level variation.
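A forward simulation of the accrual setting described above (exponential gaps between center activation times, Poisson enrollment at each active center) conveys the predictive use of such a model. The rates below are assumed values, and this sketch omits the Bayesian updating of those rates from interim data:

```python
import numpy as np

# Forward simulation of multicenter accrual: exponential gaps between center
# activation times and Poisson enrollment per active center. Rates are
# assumed values; the Bayesian updating from interim data is not shown.
rng = np.random.default_rng(2024)

def simulate_accrual(n_centers=40, horizon=104.0, gap_rate=0.5, enroll_rate=0.16):
    gaps = rng.exponential(1.0 / gap_rate, n_centers)
    activation = np.cumsum(gaps)                       # activation times (weeks)
    open_time = np.clip(horizon - activation, 0.0, None)
    return rng.poisson(enroll_rate * open_time).sum()  # subjects by horizon

totals = np.array([simulate_accrual() for _ in range(5000)])
print(f"median accrual: {np.median(totals):.0f} subjects")
print(f"P(>= 400 subjects by week 104) = {(totals >= 400).mean():.2f}")
```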

17.
In clinical trials, continuous monitoring of event incidence rate plays a critical role in making timely decisions affecting trial outcome. For example, continuous monitoring of adverse events protects the safety of trial participants, while continuous monitoring of efficacy events helps identify early signals of efficacy or futility. Because the endpoint of interest is often the event incidence associated with a given length of treatment duration (e.g., incidence proportion of an adverse event with 2 years of dosing), assessing the event proportion before reaching the intended treatment duration becomes challenging, especially when the event onset profile evolves over time with accumulated exposure. In particular, in the earlier part of the study, ignoring censored subjects may result in significant bias in estimating the cumulative event incidence rate. Such a problem is addressed using a predictive approach in the Bayesian framework. In the proposed approach, experts' prior knowledge about both the frequency and timing of the event occurrence is combined with observed data. More specifically, during any interim look, each event-free subject will be counted with a probability that is derived using prior knowledge. The proposed approach is particularly useful in early stage studies for signal detection based on limited information. But it can also be used as a tool for safety monitoring (e.g., data monitoring committee) during later stage trials. Application of the approach is illustrated using a case study where the incidence rate of an adverse event is continuously monitored during an Alzheimer's disease clinical trial. The performance of the proposed approach is also assessed and compared with other Bayesian and frequentist methods via simulation.
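One plausible reading of the counting rule, with assumed prior quantities rather than the paper's exact formulation: a subject who is event-free at exposure t is counted with the prior probability of an event by the full duration T given survival to t.

```python
import numpy as np
from scipy import stats

# Illustrative reading of the predictive counting rule (assumed prior
# quantities, not the paper's exact formulation): an event-free subject at
# exposure t counts with the prior probability of an event by duration T,
# conditional on being event-free at t.
T = 104.0                         # intended treatment duration (weeks)
prior_incidence = 0.10            # prior P(event by T)
onset = stats.expon(scale=40.0)   # prior onset distribution, given an event

def count_weight(t):
    F_t, F_T = onset.cdf(t), onset.cdf(T)
    return prior_incidence * (F_T - F_t) / (1.0 - prior_incidence * F_t)

# Interim data: 6 observed events among 200 subjects; the remaining 194 are
# event-free with varying exposure (illustrative values).
rng = np.random.default_rng(9)
exposures = rng.uniform(10.0, 60.0, 194)
expected_events = 6 + np.sum(count_weight(exposures))
print(f"predicted incidence by week {T:.0f}: {expected_events / 200:.3f}")
```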

18.
This article describes a full Bayesian treatment for simultaneous fixed-effect selection and parameter estimation in high-dimensional generalized linear mixed models. The approach consists of using a Bayesian adaptive Lasso penalty for signal-level adaptive shrinkage and a fast variational Bayes scheme for estimating the posterior mode of the coefficients. The proposed approach offers several advantages over existing methods; for example, the adaptive shrinkage parameters are incorporated automatically, and no Laplace approximation step is required to integrate out the random effects. The performance of our approach is illustrated on several simulated and real data examples. The algorithm is implemented in the R package glmmvb and is made available online.

19.
The varying coefficient model (VCM) is an important generalization of the linear regression model, and many existing estimation procedures for the VCM are built on the L2 loss, which is popular for its mathematical convenience but is not robust to non-normal errors and outliers. In this paper, we address both the robustness and the efficiency of estimation and variable selection by basing the procedure on a convex combination of the L1 and L2 losses instead of the quadratic loss alone. Using the local linear modeling method, the asymptotic normality of the estimator is derived, and a useful method is proposed for selecting the weight in the composite L1 and L2 loss. A variable selection procedure is then given by combining local kernel smoothing with the adaptive group LASSO. With appropriate selection of tuning parameters by the Bayesian information criterion (BIC), the theoretical properties of the new procedure, including consistency in variable selection and the oracle property in estimation, are established. The finite sample performance of the new method is investigated through simulation studies and an analysis of body fat data. Numerical studies show that the new method performs as well as or better than the least-squares-based method in terms of both robustness and efficiency for variable selection.
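The composite loss itself is easy to state: L(r) = w|r| + (1 - w)r^2 for a weight w in [0, 1]. A plain-regression sketch of fitting under this loss (the paper applies it within local linear estimation of the VCM, which is not reproduced here):

```python
import numpy as np
from scipy.optimize import minimize

# Fitting a linear model under the convex combined loss
#   L(r) = w * |r| + (1 - w) * r**2
# (plain-regression sketch of the composite-loss idea; w fixed by hand
# rather than selected by the paper's weighting method).
rng = np.random.default_rng(8)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.standard_t(df=2, size=n)       # heavy-tailed errors

def composite_loss(beta, w=0.5):
    r = y - X @ beta
    return np.sum(w * np.abs(r) + (1.0 - w) * r ** 2)

fit = minimize(composite_loss, x0=np.zeros(2), method="Nelder-Mead")
ols = np.linalg.lstsq(X, y, rcond=None)[0]
print("composite:", np.round(fit.x, 2), " OLS:", np.round(ols, 2))
```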

20.
Copula, marginal distributions and model selection: a Bayesian note
Copula functions and marginal distributions are combined to produce multivariate distributions. We show the advantages of estimating all parameters of these models using the Bayesian approach, which can be done with standard Markov chain Monte Carlo algorithms. Deviance-based model selection criteria are also discussed as applied to copula models, since they are invariant under monotone increasing transformations of the marginals; we focus on the deviance information criterion. The joint estimation takes into account the full dependence structure of the parameters' posterior distribution in our chosen model selection criteria. Two Monte Carlo studies are conducted to show that model identification improves when the model parameters are estimated jointly. We study the Bayesian estimation of all unknown quantities at once, considering bivariate copula functions and three known marginal distributions.
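A minimal sketch of the joint likelihood being estimated, using a bivariate Gaussian copula with normal and exponential marginals as illustrative choices (maximum likelihood here, rather than the paper's MCMC): the copula correlation and all marginal parameters enter a single likelihood.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

# Joint likelihood of a bivariate Gaussian copula with parametric marginals
# (normal and exponential chosen for illustration). All parameters, copula
# correlation included, enter one likelihood, estimated here by maximum
# likelihood rather than MCMC.
def neg_loglik(theta, x, y):
    mu, log_sig, log_rate, arho = theta
    sig, rate, rho = np.exp(log_sig), np.exp(log_rate), np.tanh(arho)
    u = np.clip(stats.norm.cdf(x, mu, sig), 1e-12, 1 - 1e-12)
    v = np.clip(stats.expon.cdf(y, scale=1.0 / rate), 1e-12, 1 - 1e-12)
    z1, z2 = stats.norm.ppf(u), stats.norm.ppf(v)
    log_c = (-0.5 * np.log(1 - rho ** 2)
             + (2 * rho * z1 * z2 - rho ** 2 * (z1 ** 2 + z2 ** 2))
             / (2 * (1 - rho ** 2)))
    log_marg = (stats.norm.logpdf(x, mu, sig)
                + stats.expon.logpdf(y, scale=1.0 / rate))
    return -np.sum(log_c + log_marg)

# Simulate data with Gaussian-copula dependence (rho = 0.6) and the two marginals.
rng = np.random.default_rng(6)
z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=400)
x = 1.0 + 2.0 * z[:, 0]                                   # Normal(1, 2) marginal
y = stats.expon.ppf(stats.norm.cdf(z[:, 1]), scale=2.0)   # Exponential(scale 2) marginal

fit = minimize(neg_loglik, x0=np.zeros(4), args=(x, y),
               method="Nelder-Mead", options={"maxiter": 4000})
mu, sig = fit.x[0], np.exp(fit.x[1])
rate, rho = np.exp(fit.x[2]), np.tanh(fit.x[3])
print(f"mu={mu:.2f} sigma={sig:.2f} rate={rate:.2f} rho={rho:.2f}")
```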

