Article Search
  Paid full text   24 articles
  Free   0 articles
  Management   2 articles
  Sociology   2 articles
  Statistics   20 articles
  2023   2 articles
  2020   2 articles
  2019   1 article
  2017   1 article
  2016   2 articles
  2015   1 article
  2014   1 article
  2013   4 articles
  2011   2 articles
  2009   1 article
  2007   1 article
  2003   1 article
  2002   1 article
  1999   1 article
  1998   1 article
  1997   2 articles
Sort order: 24 query results found (search time: 15 ms)
1.
This paper deals with the analysis of multivariate survival data from a Bayesian perspective using Markov chain Monte Carlo methods. A Metropolis-within-Gibbs algorithm is used to calculate some of the marginal posterior distributions. A multivariate survival model is proposed, since survival times within the same group are correlated as a consequence of a frailty random block effect. The conditional proportional-hazards model of Clayton and Cuzick is used with a martingale-structured prior process (Arjas and Gasbarra) for the discretized baseline hazard. Besides the calculation of the marginal posterior distributions of the parameters of interest, this paper presents some Bayesian EDA diagnostic techniques for assessing model adequacy. The methodology is illustrated with kidney infection data, where the times to infection within the same patient are expected to be correlated.
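As a generic illustration of the Metropolis-within-Gibbs idea referred to above (a minimal sketch with an arbitrary bivariate target, not the authors' survival model or code):

    import numpy as np

    # Generic Metropolis-within-Gibbs: cycle through the parameters, updating
    # each with a random-walk Metropolis step given the current value of the
    # other.  The target is an arbitrary correlated bivariate normal (rho = 0.8),
    # chosen only to keep the sketch self-contained.
    def log_target(theta):
        x, y = theta
        return -0.5 * (x**2 - 1.6 * x * y + y**2) / (1.0 - 0.8**2)

    rng = np.random.default_rng(0)
    theta = np.zeros(2)
    draws = np.empty((5000, 2))
    for it in range(5000):
        for j in range(2):                    # one Metropolis step per coordinate
            prop = theta.copy()
            prop[j] += rng.normal(scale=1.0)  # random-walk proposal
            if np.log(rng.uniform()) < log_target(prop) - log_target(theta):
                theta = prop                  # accept; otherwise keep current value
        draws[it] = theta

    print(draws[1000:].mean(axis=0))          # approximate means after burn-in
    print(np.corrcoef(draws[1000:].T)[0, 1])  # recovers the target correlation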
2.
A robust process minimises the effect of noise factors on the performance of a product or process. The variation in the performance of a robust process can be measured through modelling and analysis of process robustness. In this paper, a comprehensive methodology for modelling and analysing process robustness is developed using a number of relevant tools and techniques, such as multivariate regression, control charting and simulation, within the broad framework of the Taguchi method. Specifically, the methodology comprises process modelling from historical data on responses, input variables and parameters as well as simulated noise-variable data; identification of the model responses at each experimental setting of the controllable variables; estimation of multivariate process capability indices; and control of their variability via control charting to determine optimal settings of the process variables through a design-of-experiments-based Taguchi method. The methodology is applied to a centrifugal casting process that produces worm wheels for steam power plants, a process for which maintaining consistent performance under varying input conditions is critical. The results show that the process settings so determined ensure minimum in-control variability with maximum performance of the centrifugal casting process, indicating an improved level of robustness.
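For illustration, the robustness measure that Taguchi-style analyses typically maximise is a signal-to-noise ratio computed from replicate observations at each experimental setting; a minimal sketch with made-up replicate data (the setting names and values are ours, not the paper's) follows:

    import numpy as np

    # Taguchi nominal-the-best signal-to-noise ratio, SN = 10*log10(ybar^2 / s^2),
    # computed per experimental setting from replicates observed under noise
    # conditions; a higher SN means less sensitivity to the noise factors.
    def sn_nominal_the_best(y):
        y = np.asarray(y, dtype=float)
        return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

    runs = {                                   # hypothetical replicate responses
        "setting_1": [10.2, 9.8, 10.1, 10.0],
        "setting_2": [10.6, 9.1, 11.0, 9.4],
    }
    for name, y in runs.items():
        print(name, round(sn_nominal_the_best(y), 2), "dB")
    print("most robust setting:",
          max(runs, key=lambda k: sn_nominal_the_best(runs[k])))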
3.
Frequently in the analysis of survival data, survival times within the same group are correlated due to unobserved covariates. One way these covariates can be included in the model is as frailties. These frailty random block effects generate dependence between the survival times of the individuals, which are conditionally independent given the frailty. Using a conditional proportional hazards model in conjunction with the frailty, a whole new family of models is introduced. When a gamma frailty model is considered, the issue is often to find an appropriate model for the baseline hazard function. In this paper a flexible baseline hazard model based on a correlated prior process is proposed and is compared with a standard Weibull model. Several model diagnostic methods are developed, and model comparison is made using recently developed Bayesian model selection criteria. The above methodologies are applied to the McGilchrist and Aisbett (1991) kidney infection data and the analysis is performed using Markov chain Monte Carlo methods.
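For concreteness, the shared gamma frailty proportional-hazards structure described above is commonly written (in generic notation, which may differ from the paper's) as

$$h_{ij}(t \mid w_i) = w_i \, h_0(t) \, \exp(x_{ij}^{\top} \beta), \qquad w_i \sim \mathrm{Gamma}(\eta^{-1}, \eta^{-1}),$$

so that $E(w_i) = 1$ and $\mathrm{Var}(w_i) = \eta$; survival times within group $i$ are conditionally independent given the frailty $w_i$, and integrating out $w_i$ induces positive association within the group. Here $h_0(t)$ is the baseline hazard, for which a flexible correlated prior process is one modelling option.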
4.
5.
6.
The Qos and Qm are two leading estimators of the probability of misclassification, based on the asymptotic expansion of the expected value of the error rate, Pi. The estimators are, however, not suitable for estimating the error rate for certain ranges of the parameters p, n1, n2 and ß. We investigate the regions in which they produce unacceptable estimates, and show that Qos is, in general, better than Qm at producing acceptable estimates.
7.
We consider settings where it is of interest to fit and assess regression submodels that arise as various explanatory variables are excluded from a larger regression model. The larger model is referred to as the full model; the submodels are the reduced models. We show that a computationally efficient approximation to the regression estimates under any reduced model can be obtained from a simple weighted least squares (WLS) approach based on the estimated regression parameters and covariance matrix from the full model. This WLS approach can be considered an extension, to unbiased estimating equations, of a first-order Taylor series approach proposed by Lawless and Singhal. Using data from the 2010 Nationwide Inpatient Sample (NIS), a 20% weighted, stratified, cluster sample of approximately 8 million hospital stays from approximately 1000 hospitals, we illustrate the WLS approach when fitting interval-censored regression models to estimate the effect of type of surgery (robotic versus non-robotic surgery) on hospital length of stay while adjusting for three sets of covariates: patient-level characteristics, hospital characteristics, and zip-code-level characteristics. Ordinarily, standard fitting of the reduced models to the NIS data takes approximately 10 hours; using the proposed WLS approach, the reduced models take seconds to fit.
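A minimal sketch of the kind of reduction the abstract describes, under our illustrative assumption that dropping covariates amounts to constraining their coefficients to zero and solving a WLS problem built from the full-model estimates and covariance matrix (this is not the authors' code, and the inputs below are simulated placeholders):

    import numpy as np

    # Hypothetical full-model output: 5 coefficients and their covariance matrix.
    rng = np.random.default_rng(1)
    beta_full = rng.normal(size=5)
    A = rng.normal(size=(5, 5))
    V_full = A @ A.T + 5.0 * np.eye(5)        # any symmetric positive-definite matrix

    def wls_reduced(beta_full, V_full, keep):
        """Approximate reduced-model estimates by constraining the dropped
        coefficients to zero and minimising the weighted least squares criterion
        (b - beta_full)' V_full^{-1} (b - beta_full)."""
        keep = np.asarray(keep)
        drop = np.setdiff1d(np.arange(len(beta_full)), keep)
        V_kd = V_full[np.ix_(keep, drop)]
        V_dd_inv = np.linalg.inv(V_full[np.ix_(drop, drop)])
        beta_red = beta_full[keep] - V_kd @ V_dd_inv @ beta_full[drop]
        V_red = V_full[np.ix_(keep, keep)] - V_kd @ V_dd_inv @ V_kd.T
        return beta_red, V_red

    beta_red, V_red = wls_reduced(beta_full, V_full, keep=[0, 1, 2])
    print(beta_red)                           # approximate reduced-model coefficients
    print(np.sqrt(np.diag(V_red)))            # approximate reduced-model std. errors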
8.
In this paper we introduce a new type-II progressive censoring scheme for two samples. It is observed that the proposed censoring scheme is analytically more tractable than the existing joint progressive type-II censoring scheme proposed by Rasouli and Balakrishnan. The maximum likelihood estimators of the unknown parameters are obtained and their exact distributions are derived. Based on the exact distributions of the maximum likelihood estimators, exact confidence intervals are also constructed. For comparison purposes, bootstrap confidence intervals are used as well. One data analysis is performed for illustrative purposes. Finally, we propose some open problems.
9.
10.
Summary. We propose a new parametric survival model for cancer prevention studies. The formulation of the model is in the spirit of stochastic modelling of the occurrence of tumours through two stages: initiation of an undetected tumour and promotion of the tumour to a detectable cancer. Several novel properties of the proposed model are derived. In addition, we examine the relationship of our model with the existing lagged regression model of Zucker and Lakatos. We also bridge the difference between two distinct stochastic modelling methods for cancer data, one used primarily for cancer therapeutic trials and the other for cancer prevention trials.
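One widely used two-stage ("initiation, then promotion") formulation in this spirit (given here only as a generic illustration; whether it coincides with the paper's exact construction is an assumption) posits a Poisson($\theta$) number of initiated, undetected tumour cells and i.i.d. promotion times to detectability with distribution function $F$, giving the population survival function

$$S(t) = \exp\{-\theta F(t)\},$$

which places positive mass $\exp(-\theta)$ on never developing a detectable cancer.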