Paid full text: 43 articles    Free: 0 articles
By subject: Management 12; General 2; Statistics 29
By year: 2021 (1); 2019 (3); 2018 (2); 2017 (2); 2016 (1); 2014 (4); 2013 (4); 2012 (2); 2011 (1); 2008 (1); 2007 (2); 2006 (2); 2005 (1); 2003 (3); 2002 (1); 1999 (1); 1998 (2); 1995 (2); 1993 (1); 1992 (1); 1989 (3); 1988 (2); 1985 (1)
A total of 43 results were found (search time: 31 ms).
31.
Multi-state Models in Epidemiology   (Total citations: 1; self-citations: 0; other citations: 1)
I first discuss the main assumptions that can be made for multi-state models: the time-homogeneity and semi-Markov assumptions, the problem of choosing the time scale, the assumption of homogeneity of the population, and assumptions about the way the observations are incomplete, leading to truncation and censoring. The influence of covariates, of different durations, and of time-dependent variables is synthesized using explanatory processes, and a general additive model for transition intensities is presented. Different inference approaches, including penalized likelihood, are considered. Finally, three examples of application in epidemiology are presented and some references to other work are given.
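As a minimal numerical sketch of the simplest setting mentioned above, a time-homogeneous Markov illness-death model with hypothetical intensities (not anything estimated in the paper), transition probabilities follow from the intensity matrix via a matrix exponential:

# Minimal sketch: time-homogeneous Markov illness-death model with
# hypothetical transition intensities; P(t) = exp(Qt).
import numpy as np
from scipy.linalg import expm

# States: 0 = healthy, 1 = ill, 2 = dead (absorbing); intensities per year (hypothetical).
Q = np.array([
    [-0.15,  0.10,  0.05],   # healthy -> ill, healthy -> dead
    [ 0.00, -0.30,  0.30],   # ill -> dead
    [ 0.00,  0.00,  0.00],   # dead is absorbing
])

t = 5.0
P = expm(Q * t)   # P[h, j] = probability of being in state j at time t, starting in state h
print(P[0])       # state-occupation probabilities at t = 5 for an initially healthy subject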
32.
Quantitative Assessment of Building Fire Risk to Life Safety   (Total citations: 1; self-citations: 0; other citations: 1)
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time of untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty in the onset time of untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequence of a fire scenario can then be evaluated from the probability distributions of evacuation time and of the onset time of untenable conditions, and fire risk to life safety follows from the occurrence probability and consequence of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study, and the assessment result is compared with fire statistics.
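To make the final step concrete, here is a rough Monte Carlo sketch of combining scenario probabilities with the evacuation-versus-untenability comparison; the scenario probabilities and time distributions below are entirely hypothetical placeholders for the quantities that the article's event-tree and Markov-chain analysis would actually supply:

# Hypothetical sketch: risk = sum over scenarios of
#   P(scenario, given a fire) * P(evacuation not finished before conditions become untenable).
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical scenarios: (occurrence probability given a fire,
# mean onset time of untenable conditions in seconds, its standard deviation).
scenarios = [(0.6, 600, 120), (0.3, 400, 100), (0.1, 250, 80)]

risk = 0.0
for p_occ, onset_mean, onset_sd in scenarios:
    onset = rng.normal(onset_mean, onset_sd, N)                    # onset time of untenable conditions
    premove = rng.lognormal(mean=np.log(120), sigma=0.5, size=N)   # pre-movement time distribution
    evac = premove + 180.0                                         # plus an assumed fixed movement time (s)
    p_fail = np.mean(evac > onset)                                 # probability of not escaping in time
    risk += p_occ * p_fail

print(f"probability of untenable exposure, given a fire: {risk:.3f}")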
33.
34.
Wong et al. [(2018), ‘Piece-wise Proportional Hazards Models with Interval-censored Data’, Journal of Statistical Computation and Simulation, 88, 140–155] studied the piecewise proportional hazards (PWPH) model with interval-censored (IC) data under a distribution-free set-up. It is well known that the partial likelihood approach is not applicable to IC data, and Wong et al. (2018) showed that the standard generalised likelihood approach does not work either. They proposed the maximum modified generalised likelihood estimator (MMGLE), and their simulation results suggest that the MMGLE is consistent. We establish the consistency and asymptotic normality of the MMGLE.
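For readers unfamiliar with interval censoring, the sketch below shows the generic form of an interval-censored likelihood contribution, S(L|x) − S(R|x), under a proportional hazards model with a piecewise-constant baseline hazard; it only illustrates the data structure and is neither the PWPH model nor the modified generalised likelihood of Wong et al. (2018):

# Generic illustration (hypothetical data), not Wong et al.'s method.
import numpy as np

def cum_baseline(t, cuts, lam):
    """Cumulative baseline hazard H0(t) for piecewise-constant hazards `lam`
    on intervals defined by `cuts` (0 = cuts[0] < cuts[1] < ... )."""
    H = 0.0
    for k in range(len(lam)):
        lo, hi = cuts[k], cuts[k + 1]
        H += lam[k] * max(0.0, min(t, hi) - lo)
    return H

def loglik(beta, lam, cuts, L, R, x):
    """Sum of log{S(L|x) - S(R|x)} over subjects observed in intervals (L, R]."""
    ll = 0.0
    for Li, Ri, xi in zip(L, R, x):
        eta = np.exp(np.dot(beta, xi))
        SL = np.exp(-cum_baseline(Li, cuts, lam) * eta)
        SR = np.exp(-cum_baseline(Ri, cuts, lam) * eta) if np.isfinite(Ri) else 0.0
        ll += np.log(max(SL - SR, 1e-300))
    return ll

# Toy usage with hypothetical cut-points, hazards, intervals, and covariates.
cuts = [0.0, 1.0, 2.0, np.inf]
lam = [0.2, 0.3, 0.4]
L, R, x = [0.5, 1.2, 2.0], [1.5, np.inf, 3.0], [[1.0], [0.0], [1.0]]
print(loglik(np.array([0.5]), lam, cuts, L, R, x))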
35.
Current survival techniques do not provide a good method for handling clinical trials with a large percentage of censored observations. This research proposes using time-dependent surrogates of survival as outcome variables, in conjunction with observed survival time, to improve the precision in comparing the relative effects of two treatments on the distribution of survival time. This is in contrast to the standard approach, which uses only the marginal density of survival time, T, or only the marginal density of a surrogate, X, thereby ignoring some of the available information. The surrogate measure X may be a fixed value or a time-dependent variable, X(t). X is a summary measure of some of the covariates measured throughout the trial that provide additional information on a subject's survival time. It is possible to model these time-dependent covariate values and relate the parameters in the model to the parameters in the distribution of T given X. The result is that three new models are available for the analysis of clinical trials. All three models use the joint density of survival time and a surrogate measure. Given one of three different assumed mechanisms of the potential treatment effect, each of the three methods improves the precision of the treatment estimate.
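Schematically (this is only the generic factorization, not the paper's three specific models), the gain comes from basing inference on the joint density of survival time and the surrogate rather than on either marginal alone:

\[
  L(\theta) \;=\; \prod_{i=1}^{n} f_{T,X}(t_i, x_i \mid \theta)
            \;=\; \prod_{i=1}^{n} f_{T \mid X}(t_i \mid x_i, \theta)\, f_X(x_i \mid \theta),
\]

whereas the standard analyses use only \(\prod_i f_T(t_i \mid \theta)\) or \(\prod_i f_X(x_i \mid \theta)\).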
36.
Survival studies usually collect, on each participant, both the duration until some terminal event and repeated measures of a time-dependent covariate. Such a covariate is referred to as an internal time-dependent covariate. Usually, some subjects drop out of the study before the occurrence of the terminal event of interest. One may then wish to evaluate the relationship between time to dropout and the internal covariate. The Cox model is a standard framework for that purpose. Here, we address this problem in situations where the value of the covariate at dropout is unobserved. We suggest a joint model that combines a first-order Markov model for the longitudinally measured covariate with a time-dependent Cox model for the dropout process. We consider maximum likelihood estimation in this model and show how estimation can be carried out via the EM algorithm. We note that the suggested joint model may have applications in the context of longitudinal data with nonignorable dropout. Indeed, it can be viewed as generalizing the model of Diggle and Kenward (1994) to situations where dropout may occur at any point in time and may be censored. Hence we apply both models and compare their results on a data set concerning longitudinal measurements among patients in a cancer clinical trial.
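As a schematic of the complete-data likelihood that such a joint model implies for subject i (the notation and parameterization here are generic, not necessarily the authors' exact formulation):

\[
  L_i(\theta,\gamma)
  \;=\;
  \Bigg[\prod_{j=2}^{n_i} f\!\left(x_{ij}\mid x_{i,j-1};\theta\right)\Bigg]
  \,\lambda\!\left(t_i \mid x_i(t_i);\gamma\right)^{\delta_i}
  \exp\!\left\{-\int_0^{t_i} \lambda\!\left(u \mid x_i(u);\gamma\right)\, du\right\},
\]

where the first factor is the first-order Markov model for the repeated covariate measurements, the second is the time-dependent dropout intensity, and \(\delta_i\) indicates an observed (uncensored) dropout; the covariate value at dropout, being unobserved, is handled in the E-step of the EM algorithm.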
37.
We present a mathematical treatment of a two-mutation model for carcinogenesis with time-dependent parameters. This model has previously been shown to be consistent with epidemiologic and experimental data. An approximate hazard function used in previous papers is critically evaluated.
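For orientation only, one commonly quoted approximation to the hazard of the two-mutation model in the constant-parameter case (the paper itself treats time-dependent parameters and critically evaluates approximations of this type) is

\[
  h(t) \;\approx\; \frac{\mu_1\,\mu_2\,N}{\alpha-\beta}\left\{ e^{(\alpha-\beta)t} - 1 \right\},
\]

where N is the number of susceptible normal cells, \(\mu_1\) and \(\mu_2\) are the first and second mutation rates, and \(\alpha\) and \(\beta\) are the birth and death rates of intermediate cells.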
38.
There has been much recent work on Bayesian approaches to survival analysis, incorporating features such as flexible baseline hazards, time-dependent covariate effects, and random effects. Some of the proposed methods are quite complicated to implement, and we argue that results as good or better can be obtained via simpler methods. In particular, the normal approximation to the log-gamma distribution yields easy and efficient computational methods under simple multivariate normal priors for the baseline log-hazards and time-dependent covariate effects. While the basic method applies to piecewise-constant hazards and covariate effects, it is easy to apply importance sampling to consider smoother functions.
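The sketch below illustrates the kind of computation involved: piecewise-constant hazards with a multivariate normal prior on the log-hazards, with the posterior mode found numerically. The event counts, exposures, and prior are hypothetical, and this is not the authors' log-gamma approximation or importance-sampling scheme:

# Generic sketch (hypothetical data), not the authors' exact algorithm.
import numpy as np
from scipy.optimize import minimize

d = np.array([5, 8, 6, 3])             # events per time interval (hypothetical)
E = np.array([120., 100., 80., 60.])   # person-time at risk per interval (hypothetical)
m = np.zeros(4)                        # prior mean of the log-hazards
S = 1.0 * np.eye(4)                    # prior covariance (could encode smoothness)

def neg_log_post(phi):
    loglik = np.sum(d * phi - np.exp(phi) * E)                  # Poisson-form likelihood
    logprior = -0.5 * (phi - m) @ np.linalg.solve(S, phi - m)   # multivariate normal prior
    return -(loglik + logprior)

fit = minimize(neg_log_post, x0=np.log((d + 0.5) / E), method="BFGS")
print(np.exp(fit.x))                   # posterior-mode hazard in each interval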
39.
Previous applications of carcinogenic risk assessment using mathematical models of carcinogenesis have focused largely on the case where the level of exposure remains constant over time. In many situations, however, the dose of the carcinogen varies with time. In this paper, we discuss both the classical Armitage-Doll multistage model and the Moolgavkar-Venzon-Knudson two-stage birth-death-mutation model with time-dependent dosing regimens. Bounds on the degree to which risk can be underestimated through the use of a simple time-weighted average dose are derived by comparison with an equivalent constant dose corresponding to the actual risk under the time-dependent dosing regimen.
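For orientation (these are the standard textbook forms, not the bounds derived in the paper): under the classical Armitage-Doll approximation with k stages and stage-specific transition rates \(\lambda_1,\dots,\lambda_k\), and with the time-weighted average dose defined in the usual way,

\[
  h(t) \;\approx\; \frac{\lambda_1 \lambda_2 \cdots \lambda_k}{(k-1)!}\, t^{\,k-1},
  \qquad
  \bar d \;=\; \frac{1}{t}\int_0^{t} d(u)\, du,
\]

where one or more of the \(\lambda_i\) may depend on dose (often linearly, \(\lambda_i(d) = a_i + b_i d\)); replacing the time-varying dose d(u) by \(\bar d\) is the simplification whose potential to understate risk the paper bounds.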
40.
In some problems in survival analysis there may be more than one plausible measure of time for each individual; for example, mileage may be a better indicator of a car's age than months. This paper considers the possibility of combining two (or more) time scales measured on each individual into a single scale. A collapsibility condition is proposed under which the combined scale can be regarded as fully informative about survival. The resulting model may be regarded as a generalization of the usual accelerated life model that allows time-dependent covariates. Parametric methods for the choice of time scale, for testing the validity of the collapsibility assumption, and for parametric inference about the failure distribution along the new scale are discussed. Two examples are used to illustrate the methods: Hyde's (1980) Channing House data and a large cohort mortality study of asbestos workers in Quebec.
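One simple way to picture the idea, using a linear combination of scales purely as an illustration of collapsibility (not as the paper's general formulation): with two scales \(t_1\) (e.g. months) and \(t_2\) (e.g. mileage),

\[
  u \;=\; t_1 + \phi\, t_2,
  \qquad
  S(t_1, t_2) \;=\; G(t_1 + \phi\, t_2),
\]

so that collapsibility requires survival to depend on the two scales only through the combined scale u, along which G plays the role of an ordinary survivor function.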