1.
Summary. The reciprocal of serum creatinine concentration, RC, is often used as a biomarker to monitor renal function. It has been observed that RC trajectories remain relatively stable after transplantation until a certain moment, when an irreversible decrease in the RC levels occurs. This decreasing trend commonly precedes failure of a graft. Two subsets of individuals can be distinguished according to their RC trajectories: a subset of individuals having stable RC levels and a subset of individuals who present an irrevocable decrease in their RC levels. To describe such data, the paper proposes a joint latent class model for longitudinal and survival data with two latent classes. RC trajectories within latent class one are modelled by an intercept-only random-effects model, and RC trajectories within latent class two are modelled by a segmented random changepoint model. A Bayesian approach is used to fit this joint model to data from patients who had their first kidney transplantation in the Leiden University Medical Center between 1983 and 2002. The resulting model describes the kidney transplantation data very well and provides better predictions of the time to failure than other joint and survival models.
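
To make the two-class structure concrete, here is one plausible specification, written as a sketch consistent with the abstract rather than as the paper's exact parameterisation; the symbols $Y_{ij}$ (RC for patient $i$ at time $t_{ij}$), the latent class $c_i \in \{1,2\}$, the random changepoint $\tau_i$ and $(x)_+ = \max(x, 0)$ are introduced here for illustration only.

% Hypothetical sketch of a two-class joint model (not the paper's exact specification)
\begin{align*}
  \Pr(c_i = 2) &= \pi,\\
  c_i = 1 \ \text{(stable)}: \quad Y_{ij} &= \beta_0 + b_{0i} + \varepsilon_{ij},\\
  c_i = 2 \ \text{(declining)}: \quad Y_{ij} &= \beta_0 + b_{0i} + (\beta_1 + b_{1i})\,t_{ij}
      + (\beta_2 + b_{2i})\,(t_{ij} - \tau_i)_+ + \varepsilon_{ij},\\
  b_i \sim \mathcal{N}(0, D), \qquad \varepsilon_{ij} &\sim \mathcal{N}(0, \sigma^2),\qquad
  h_i(t) = h_0(t)\,\exp\{\gamma\,\mathbb{1}(c_i = 2)\}.
\end{align*}

In such a specification the pre-changepoint slope $\beta_1 + b_{1i}$ would be close to zero and the slope change $\beta_2 + b_{2i}$ negative, reproducing the stable-then-declining trajectories, while the hazard of graft failure is raised in the declining class; the survival submodel and priors used in the paper may of course differ.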
2.
Summary. Hypoelliptic diffusion processes can be used to model a variety of phenomena in applications ranging from molecular dynamics to audio signal analysis. We study parameter estimation for such processes in situations where we observe some components of the solution at discrete times. Since exact likelihoods for the transition densities are typically not known, approximations are used that are expected to work well in the limit of small intersample times Δt and large total observation times NΔt. Hypoellipticity together with partial observation leads to ill conditioning, requiring a judicious combination of approximate likelihoods for the various parameters to be estimated. We combine these in a deterministic scan Gibbs sampler alternating between the missing data in the unobserved solution components and the parameters. Numerical experiments illustrate asymptotic consistency of the method when applied to simulated data. The paper concludes with an application of the Gibbs sampler to molecular dynamics data.
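
As a concrete illustration of the model class and of why the naive Euler approximation breaks down, consider a second-order hypoelliptic system in which only the position $q$ is observed at times $t_n = n\Delta t$; this is a sketch of the kind of model the abstract refers to, not necessarily the paper's exact scheme.

\begin{align*}
  \mathrm{d}q_t &= p_t\,\mathrm{d}t,\\
  \mathrm{d}p_t &= f_\theta(q_t, p_t)\,\mathrm{d}t + \sigma\,\mathrm{d}B_t.
\end{align*}

Under the Euler scheme the increment of $q$ carries no noise, so the approximate transition density is degenerate in the observed component. A higher-order It\^o--Taylor Gaussian approximation instead gives
\begin{equation*}
  \begin{pmatrix} q_{n+1} \\ p_{n+1} \end{pmatrix}
  \approx
  \begin{pmatrix} q_n + p_n\,\Delta t + \tfrac{1}{2} f_\theta(q_n, p_n)\,\Delta t^2 \\
                  p_n + f_\theta(q_n, p_n)\,\Delta t \end{pmatrix}
  + \eta_n,
  \qquad
  \eta_n \sim \mathcal{N}\!\left(0,\; \sigma^2
  \begin{pmatrix} \Delta t^3/3 & \Delta t^2/2 \\ \Delta t^2/2 & \Delta t \end{pmatrix}\right),
\end{equation*}
whose noise covariance has condition number of order $\Delta t^{-2}$ as $\Delta t \to 0$. This ill conditioning under partial observation is what forces the careful combination of approximate likelihoods for the drift and diffusion parameters within the Gibbs sampler.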
3.
Many applications require efficient sampling from Gaussian distributions. The method of choice depends on the dimension of the problem as well as on the structure of the covariance matrix (Σ) or precision matrix (Q). The most common black-box routine for computing a sample is based on Cholesky factorization. In high dimensions, computing the Cholesky factor of Σ or Q may be prohibitive, because the factor accumulates more non-zero entries than can be stored in memory. We compare different methods for computing the samples iteratively, adapting ideas from numerical linear algebra. These methods assume that matrix-vector products, Qv, are fast to compute. We show that some of the methods are competitive with and faster than Cholesky sampling, and that a parallel version of one method on a graphics processing unit (GPU) using CUDA can give a speed-up of up to 30x. Moreover, one method is used to sample from the posterior distribution of petroleum reservoir parameters in a North Sea field, given seismic reflection data on a large 3D grid.
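
To illustrate the matrix-free idea, the sketch below draws z ~ N(0, I) and approximates y = Q^{-1/2} z with a Lanczos (Krylov) iteration, so that y is approximately N(0, Q^{-1}) using only products Qv. This is a minimal sketch of one family of iterative samplers, not the exact algorithms compared in the paper; the function name and the toy precision matrix are illustrative.

import numpy as np
import scipy.sparse as sp
from scipy.linalg import eigh_tridiagonal

def lanczos_gaussian_sample(matvec, n, k, rng):
    """Approximate draw y ~ N(0, Q^{-1}) using k Lanczos steps.

    matvec(v) must return Q @ v for a symmetric positive definite Q;
    only matrix-vector products are required. In practice full
    reorthogonalisation of V is usually needed for accuracy.
    """
    z = rng.standard_normal(n)
    beta0 = np.linalg.norm(z)
    V = np.zeros((n, k))                # Lanczos basis of the Krylov space
    alpha = np.zeros(k)                 # diagonal of the tridiagonal T_k
    beta = np.zeros(max(k - 1, 1))      # off-diagonal of T_k
    V[:, 0] = z / beta0
    w = matvec(V[:, 0])
    alpha[0] = V[:, 0] @ w
    w = w - alpha[0] * V[:, 0]
    for j in range(1, k):
        beta[j - 1] = np.linalg.norm(w)
        if beta[j - 1] < 1e-12:         # Krylov space exhausted early
            alpha, beta, V, k = alpha[:j], beta[:j - 1], V[:, :j], j
            break
        V[:, j] = w / beta[j - 1]
        w = matvec(V[:, j]) - beta[j - 1] * V[:, j - 1]
        alpha[j] = V[:, j] @ w
        w = w - alpha[j] * V[:, j]
    # y ~ ||z|| * V_k * T_k^{-1/2} * e_1, via the eigendecomposition of T_k
    evals, evecs = eigh_tridiagonal(alpha[:k], beta[:k - 1])
    f = evecs @ (evecs[0, :] / np.sqrt(evals))
    return beta0 * (V[:, :k] @ f)

# Toy usage on a sparse, diagonally dominant precision matrix (hypothetical example)
rng = np.random.default_rng(0)
n = 1000
Q = sp.diags([-1.0, 2.1, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
y = lanczos_gaussian_sample(Q.dot, n, k=100, rng=rng)

The Cholesky analogue would factor Q = LL^T once and solve L^T y = z for each sample; the Krylov version never forms or stores a factor, which is why accelerating the matrix-vector product, for example on a GPU, is the natural way to speed it up.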