1.
The generalized half-normal (GHN) distribution and progressive Type-II censoring are considered in this article for statistical inference in constant-stress accelerated life testing. The EM algorithm is used to calculate the maximum likelihood estimates. The Fisher information matrix is constructed via the missing information principle and used to build asymptotic confidence intervals. Interval estimation is further discussed through bootstrap intervals. The Tierney and Kadane method, an importance sampling procedure and the Metropolis-Hastings algorithm are used to compute Bayesian estimates. Furthermore, predictive estimates for censored data and the associated prediction intervals are obtained. Three optimality criteria are considered to find the optimal stress level. A real data set illustrates the usefulness of the GHN distribution as an alternative lifetime model to well-known distributions. Finally, a simulation study is provided with discussion.
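The Metropolis-Hastings step mentioned in the abstract can be sketched in its simplest setting: posterior sampling of the GHN scale parameter θ with the shape α known and a flat prior on log θ. This is a minimal illustration, not the authors' accelerated-life-testing model, and the function names are hypothetical:

```python
import numpy as np

def ghn_loglik(data, alpha, theta):
    # log-likelihood of the generalized half-normal distribution:
    # f(x) = sqrt(2/pi) * (alpha/x) * (x/theta)^alpha * exp(-0.5 * (x/theta)^(2*alpha))
    z = (data / theta) ** alpha
    return (len(data) * 0.5 * np.log(2.0 / np.pi)
            + np.sum(np.log(alpha) - np.log(data)
                     + alpha * np.log(data / theta) - 0.5 * z ** 2))

def mh_theta(data, alpha, n_iter=5000, step=0.1, seed=0):
    # random-walk Metropolis-Hastings on log(theta), flat prior on the log scale
    rng = np.random.default_rng(seed)
    log_t = 0.0
    cur = ghn_loglik(data, alpha, np.exp(log_t))
    draws = []
    for _ in range(n_iter):
        prop = log_t + step * rng.standard_normal()
        new = ghn_loglik(data, alpha, np.exp(prop))
        if np.log(rng.random()) < new - cur:  # accept/reject step
            log_t, cur = prop, new
        draws.append(np.exp(log_t))
    return np.array(draws)
```

With α = 1 the GHN reduces to the half-normal distribution, which gives a quick sanity check: data drawn as |N(0, θ²)| should yield a posterior concentrated near θ.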
2.
Recurrent event data are largely characterized by the rate function, but smoothing techniques for estimating the rate function have not been rigorously developed or studied in the statistical literature. This paper considers moment and least squares methods for estimating the rate function from recurrent event data. Under an independent censoring assumption on the recurrent event process, we study the statistical properties of the proposed estimators and propose bootstrap procedures for bandwidth selection and for approximating confidence intervals in the estimation of the occurrence rate function. We find that the moment method, without resmoothing via a smaller bandwidth, produces a curve with nicks at the censoring times, whereas the least squares method has no such problem. Furthermore, the asymptotic variance of the least squares estimator is shown to be smaller under regularity conditions. However, in implementing the bootstrap procedures, the moment method is computationally more efficient than the least squares method because it uses condensed bootstrap data. The performance of the proposed procedures is studied through Monte Carlo simulations and an epidemiological example on intravenous drug users.
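A minimal version of a moment-type kernel estimator of the occurrence rate can be sketched as follows: smooth the pooled event times with a kernel and divide by the number of subjects still under observation. A Gaussian kernel and a simple at-risk denominator are assumed; no resmoothing or boundary correction is attempted:

```python
import numpy as np

def rate_estimate(event_times, censor_times, grid, bw):
    # event_times: list of per-subject arrays of recurrent event times
    # censor_times: per-subject censoring (end-of-observation) times
    # Estimate: kernel-smoothed event count / number of subjects at risk.
    grid = np.asarray(grid, float)
    censor_times = np.asarray(censor_times, float)
    num = np.zeros_like(grid)
    for times in event_times:
        for t in times:  # add one Gaussian kernel bump per observed event
            num += np.exp(-0.5 * ((grid - t) / bw) ** 2) / (bw * np.sqrt(2.0 * np.pi))
    at_risk = np.array([(censor_times >= g).sum() for g in grid])
    return num / np.maximum(at_risk, 1)
```

For a homogeneous Poisson process with rate 2 observed on [0, 5], the estimate away from the boundaries should be close to 2.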
3.
It is well known that, under Type II double censoring, the maximum likelihood (ML) estimators of the location and scale parameters, θ and δ, of a two-parameter exponential distribution are linear functions of the order statistics. In contrast, when θ is known, the ML estimator of δ does not admit a closed-form expression. It is shown, however, that the ML estimator of the scale parameter exists and is unique. Moreover, it has good large-sample properties. In addition, sharp lower and upper bounds for this estimator are provided, which can serve as starting points for iterative interpolation methods such as regula falsi. Explicit expressions for the expected Fisher information and the Cramér-Rao lower bound are also derived. In the Bayesian context, assuming an inverted gamma prior on δ, the uniqueness, boundedness and asymptotics of the highest posterior density estimator of δ can be deduced in a similar way. Finally, an illustrative example is included.
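Bounds that bracket the root of the likelihood equation are exactly what regula falsi needs as starting points. A generic false-position sketch, with the score function g left abstract:

```python
def regula_falsi(g, lo, hi, tol=1e-10, max_iter=100):
    # False-position iteration: maintain a bracket [lo, hi] with a sign
    # change and replace one endpoint with the secant-line root each step.
    glo, ghi = g(lo), g(hi)
    assert glo * ghi < 0, "starting bounds must bracket the root"
    mid = lo
    for _ in range(max_iter):
        mid = hi - ghi * (hi - lo) / (ghi - glo)  # secant-line intercept
        gm = g(mid)
        if abs(gm) < tol:
            break
        if glo * gm < 0:
            hi, ghi = mid, gm
        else:
            lo, glo = mid, gm
    return mid
```

Any smooth function with a bracketed root serves as a check, e.g. the cube root of 2 from the bracket [1, 2].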
4.
Oller, Gómez & Calle (2004) give a constant-sum condition for processes that generate interval-censored lifetime data. They show that in models satisfying this condition, the lifetime distribution can be estimated non-parametrically from a well-known simplified likelihood. The author shows that this constant-sum condition is equivalent to the existence of an observation process that is independent of the lifetimes and gives the same probability distribution for the observed data as the underlying true process.
5.
Asthma patients' health status may be especially sensitive to some types of air pollution, but the evidence on this is mixed. We explore the effects of ground-level ozone on asthma patients' activities, breaking the usual aggregated category of leisure into indoor and outdoor activities and differentiating these by whether the activities were active or inactive. Applying a semiparametric censored estimation method, we demonstrate that even though ozone levels were relatively low during the observation period, ozone had a significant impact on a few activities. The (non-ozone) economic and demographic variables in the model play significant roles in explaining the allocation of time among seven activities, suggesting the suitability of the approach for other household decision-making contexts.
6.
A model is introduced here for multivariate failure time data arising from heterogeneous populations. In particular, we consider a situation in which the failure times of individual subjects are often temporally clustered, so that many failures occur during a relatively short age interval. The clustering is modelled by assuming that the subjects can be divided into 'internally homogeneous' latent classes, each such class being described by a time-dependent frailty profile function. As an example, we reanalysed the dental caries data presented earlier in Härkänen et al. [Scand. J. Statist. 27 (2000) 577], as it turned out that our earlier model could not adequately describe the observed clustering.
7.
Consider a randomized trial in which the time to occurrence of a particular disease, say Pneumocystis pneumonia in an AIDS trial or breast cancer in a mammographic screening trial, is the failure time of primary interest. Suppose that time to disease is subject to informative censoring by the minimum of time to death, loss to follow-up and end of follow-up. In such a trial, the censoring time is observed for all study subjects, including failures. In the presence of informative censoring, it is not possible to consistently estimate the effect of treatment on time to disease without imposing additional non-identifiable assumptions. The goals of this paper are to specify two non-identifiable assumptions that allow one to test for and estimate an effect of treatment on time to disease in the presence of informative censoring. In a companion paper (Robins, 1995), we provide consistent and reasonably efficient semiparametric estimators of the treatment effect under these assumptions. In this paper we largely restrict attention to testing. We propose tests that, like standard weighted log-rank tests, are asymptotically distribution-free α-level tests under the null hypothesis of no causal effect of treatment on time to disease whenever the censoring and failure distributions are conditionally independent given treatment arm. However, our tests remain asymptotically distribution-free α-level tests in the presence of informative censoring provided either of our assumptions is true. In contrast, a weighted log-rank test will be an α-level test in the presence of informative censoring only if (1) one of our two non-identifiable assumptions holds, and (2) the distribution of time to censoring is the same in the two treatment arms. We also extend our methods to studies of the effect of a treatment on the evolution over time of the mean of a repeated measures outcome, such as CD4 count.
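For reference, the standard unweighted log-rank statistic that such tests generalize can be computed directly from its observed-minus-expected form (a minimal two-group sketch with the usual hypergeometric variance at each event time):

```python
import numpy as np

def logrank_stat(time, event, group):
    # Two-group log-rank statistic; group coded 0/1, event = 1 for failures.
    # Returns Z = sum(O - E) / sqrt(sum V), approximately N(0,1) under the null.
    time, event, group = map(np.asarray, (time, event, group))
    obs_minus_exp, var = 0.0, 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n = at_risk.sum()                       # total at risk at t
        n1 = (at_risk & (group == 1)).sum()     # group-1 at risk at t
        d = ((time == t) & event.astype(bool)).sum()          # deaths at t
        d1 = ((time == t) & event.astype(bool) & (group == 1)).sum()
        obs_minus_exp += d1 - d * n1 / n
        if n > 1:  # hypergeometric variance of d1 given the margins
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return obs_minus_exp / np.sqrt(var)
```

On the tiny hand-checkable data set below (failures at 1, 3 in group 0 and at 2, 4 in group 1), the statistic works out to (-2/3)/sqrt(13/18).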
8.
This paper considers the design of accelerated life test (ALT) sampling plans under Type I progressive interval censoring with random removals. We assume that the lifetime of products follows a Weibull distribution. Two levels of constant stress higher than the use condition are used. The sample size and the acceptability constant that satisfy given levels of producer's risk and consumer's risk are found. In particular, the optimal stress level and the allocation proportion are obtained by minimizing the generalized asymptotic variance of the maximum likelihood estimators of the model parameters. Furthermore, for validation purposes, a Monte Carlo simulation is conducted to assess the true probability of acceptance for the derived sampling plans.
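The Monte Carlo validation step can be sketched generically: simulate many Weibull samples under a candidate plan and estimate the probability of acceptance. The acceptance rule below, mean of the log-lifetimes minus k times their standard deviation compared against a lower specification limit, is a hypothetical stand-in for the paper's plan, not a reproduction of it:

```python
import numpy as np

def acceptance_prob(shape, scale, n, k, lower_spec, n_sim=20000, seed=0):
    # Monte Carlo estimate of P(accept) for a variables sampling plan:
    # accept when mean(log T) - k * sd(log T) >= log(lower_spec),
    # with T ~ Weibull(shape, scale) and sample size n per lot.
    rng = np.random.default_rng(seed)
    x = np.log(scale * rng.weibull(shape, size=(n_sim, n)))
    stat = x.mean(axis=1) - k * x.std(axis=1, ddof=1)
    return (stat >= np.log(lower_spec)).mean()
```

A basic sanity check is monotonicity: lots with a larger scale (longer lifetimes) must be accepted at least as often as lots with a smaller scale.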
10.
In this paper, the Bayesian analysis of incomplete categorical data under informative general censoring proposed by Paulino and Pereira (1995) is revisited. That analysis is based on Dirichlet priors and can be applied to any missing-data pattern. However, the known properties of the posterior distributions are scarce, and severe limitations on the posterior computations therefore remain. Here it is shown how a Monte Carlo simulation approach based on an alternative parameterisation can be used to overcome these computational difficulties. The proposed simulation approach enables approximate estimation of general parametric functions and can be implemented in a very straightforward way.
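The underlying Monte Carlo idea can be sketched in its simplest form: draw from a Dirichlet posterior and average any parametric function of the cell probabilities. Complete multinomial data and a generic function fn are assumed here; the paper's alternative parameterisation for the censored case is not reproduced:

```python
import numpy as np

def dirichlet_posterior_mean(counts, prior, fn, n_draws=10000, seed=0):
    # Posterior under a Dirichlet(prior) prior and multinomial counts is
    # Dirichlet(prior + counts); estimate E[fn(p) | data] by simulation.
    rng = np.random.default_rng(seed)
    draws = rng.dirichlet(np.asarray(prior) + np.asarray(counts), size=n_draws)
    return np.mean([fn(p) for p in draws])
```

The conjugate case gives an exact check: with counts (8, 2) and a uniform Dirichlet(1, 1) prior, the posterior mean of the first cell probability is 9/12 = 0.75.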