A total of 54 matching articles were found; results 1–10 are listed below.
1.
In this article, a one-sample procedure for multiple comparisons of exponential location parameters with a control under heteroscedasticity is proposed. The observations come from doubly censored samples. One-sided and two-sided confidence intervals are used to perform the multiple comparisons. Statistical tables of critical values and an example comparing four drugs for treating leukemia are provided.
2.
It is shown that the locally best invariant test for the existence of outliers in the scale parameters of the gamma distribution is given by Bartholomew's test for exponentiality, whose statistic is the ratio of the sum of squares of the data to the square of the sample mean. The optimality robustness of the test, including null and non-null robustness, is shown. A small simulation study comparing its power with eight other competing tests of exponentiality is performed. The locally best invariant test is not always best but is reasonably good: it is slightly better than Cochran's test and suffers less from the limiting masking effect.
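As a toy illustration (our own, not taken from the paper), the statistic as literally described above — the sum of squares of the observations divided by the square of the sample mean — is scale invariant, so its null distribution under exponentiality can be obtained by Monte Carlo. The sketch below assumes positive data and treats large values of the statistic as the critical direction.

```python
import numpy as np

def bartholomew_stat(x):
    """Sum of squares of the observations divided by the square of the sample
    mean, as the abstract describes Bartholomew's statistic."""
    x = np.asarray(x, dtype=float)
    return np.sum(x ** 2) / np.mean(x) ** 2

def mc_pvalue(x, n_sim=10_000, seed=0):
    """Monte Carlo p-value under the exponential null; the statistic is scale
    invariant, so unit-rate exponentials suffice for the null distribution."""
    rng = np.random.default_rng(seed)
    t_obs = bartholomew_stat(x)
    t_null = np.array([bartholomew_stat(rng.exponential(size=len(x)))
                       for _ in range(n_sim)])
    # large values point to unusually large observations (upper scale outliers)
    return float(np.mean(t_null >= t_obs))

print(mc_pvalue([1.2, 0.4, 2.7, 0.9, 0.3, 11.5]))  # suspiciously large last value
```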
3.
Doubly adaptive biased coin design (DBCD) is an important family of response-adaptive randomization procedures for clinical trials. It uses sequentially updated estimates to skew the allocation probability in favor of the treatment that has performed better so far. An important assumption behind the DBCD is that patient responses are homogeneous, yet this assumption may be violated in many sequential experiments. Here we prove the robustness of the DBCD against certain time trends in patient responses: strong consistency and asymptotic normality of the design are obtained under widely satisfied conditions. We also propose a general weighted likelihood method to reduce the bias that this heterogeneity causes in post-trial inference. Numerical studies illustrate the finite-sample properties of the DBCD.
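A minimal sketch of a generic DBCD with binary responses may help fix ideas; it is not the paper's exact setting. It uses the Hu–Zhang allocation function with the target allocation ρ = √p₁/(√p₁ + √p₂); the add-one smoothing, the choice γ = 2, and the simulation itself are illustrative assumptions.

```python
import numpy as np

def hu_zhang_g(x, rho, gamma=2.0):
    """Allocation function of Hu and Zhang: pushes the probability of assigning
    treatment 1 toward the target rho when the current proportion x drifts away."""
    x = min(max(x, 1e-6), 1 - 1e-6)
    num = rho * (rho / x) ** gamma
    return num / (num + (1 - rho) * ((1 - rho) / (1 - x)) ** gamma)

def simulate_dbcd(p1, p2, n=200, gamma=2.0, seed=0):
    """Toy DBCD trial with Bernoulli responses and target allocation
    rho = sqrt(p1) / (sqrt(p1) + sqrt(p2)), estimated sequentially."""
    rng = np.random.default_rng(seed)
    n1 = s1 = n2 = s2 = 0
    for _ in range(n):
        # add-one smoothed success-rate estimates from the responses so far
        p1_hat, p2_hat = (s1 + 1) / (n1 + 2), (s2 + 1) / (n2 + 2)
        rho = np.sqrt(p1_hat) / (np.sqrt(p1_hat) + np.sqrt(p2_hat))
        if n1 + n2 == 0:
            prob1 = 0.5
        else:
            prob1 = hu_zhang_g(n1 / (n1 + n2), rho, gamma)
        if rng.random() < prob1:
            n1 += 1
            s1 += rng.random() < p1
        else:
            n2 += 1
            s2 += rng.random() < p2
    return n1 / n  # realized proportion assigned to treatment 1

print(simulate_dbcd(p1=0.7, p2=0.4))  # tends toward sqrt(.7)/(sqrt(.7)+sqrt(.4)) ≈ 0.57
```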
4.
A doubly stochastic measure (DSM) is a measure μ on the unit square such that μ([0, 1] × A) = μ(A × [0, 1]) = m(A), where m is Lebesgue measure. The set of DSMs forms a convex set in the space of measures. It is known that DSMs supported on the union of two graphs of invertible functions are extreme points of that convex set (Seethoff and Shiflett, 1977/78). In general, there are few examples of extreme points in the literature. There are examples of so-called hairpins, where the two functions involved are inverses of each other, but there are also examples in which the union of the graphs of a function and its inverse does not support a DSM (Sherwood and Taylor, 1988). In this paper, for a function f in a certain class, we find companion functions g such that the union of the graphs of f and g supports a DSM even though the union of the graphs of f and f-inverse does not.
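For readability, the defining property of a DSM and the hairpin-type support discussed above can be written in display form (the graph notation Γ(·) is ours):

```latex
% Defining property of a doubly stochastic measure \mu on the unit square:
\[
  \mu\bigl([0,1]\times A\bigr) \;=\; \mu\bigl(A\times[0,1]\bigr) \;=\; m(A)
  \qquad\text{for every Borel set } A \subseteq [0,1].
\]
% Hairpin-type support: the union of the graphs of f and its inverse,
% where \Gamma(f) = \{(x, f(x)) : x \in [0,1]\}:
\[
  \mathrm{supp}(\mu) \;\subseteq\; \Gamma(f)\cup\Gamma\bigl(f^{-1}\bigr).
\]
```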
5.
Hea-Jung Kim, Statistics, 2013, 47(5): 421–441
This article develops a class of weighted normal distributions whose probability density functions have the form of a product of a normal density and a weight function. The class comprises the marginal distributions obtained from various kinds of doubly truncated bivariate normal distributions. It strictly includes the normal, skew-normal, and two-piece skew-normal distributions and is useful for selection modelling and inequality-constrained normal mean analysis. Some distributional properties and Bayesian perspectives of the class are given, along with a probabilistic representation of its members. The representation is shown to make specifying the distribution and implementing the computation straightforward, with output readily adapted for the required analysis. Necessary theory and illustrative examples are provided.
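A brief numerical sketch (our own, using only the generic weight-function-times-normal form stated above) shows the construction; with the weight w(x) = Φ(αx) it reduces to the familiar skew-normal density, which is used as a check below.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def weighted_normal_pdf(x, weight, loc=0.0, scale=1.0):
    """Density proportional to weight(x) * normal_pdf(x); the normalizing
    constant is obtained by numerical integration."""
    base = lambda t: weight(t) * stats.norm.pdf(t, loc, scale)
    const, _ = quad(base, -np.inf, np.inf)
    return base(x) / const

# With weight(x) = Phi(alpha * x) the construction reduces to the skew-normal.
alpha, x = 3.0, 1.2
f_weighted = weighted_normal_pdf(x, lambda t: stats.norm.cdf(alpha * t))
f_skewnorm = stats.skewnorm.pdf(x, alpha)
assert abs(f_weighted - f_skewnorm) < 1e-6
```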
6.
The Bayes estimators of the Gini index, the mean income, and the proportion of the population living below a prescribed income level are obtained in this paper on the basis of censored income data from a Pareto income distribution. The estimators are derived under a two-parameter exponential prior distribution and the usual squared-error loss function. The work is also extended to the case in which the income data are grouped and the exact incomes of the individuals in the population are not available. A method for assessing the hyperparameters is outlined, and the results are generalized to the doubly truncated gamma prior distribution.
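As a hedged sketch of the flavour of such Bayes estimates (not the paper's censored-data or grouped-data derivation), consider uncensored Pareto(α, x_m) incomes with known scale x_m and a conjugate gamma prior on the shape α (an exponential prior is the unit-shape special case); under squared-error loss the Bayes estimate of the Gini index 1/(2α − 1) is its posterior mean, approximated here by Monte Carlo.

```python
import numpy as np
from scipy import stats

def bayes_gini_pareto(incomes, xm, a0=1.0, b0=1.0, n_draws=100_000, seed=0):
    """Posterior-mean (squared-error-loss Bayes) estimate of the Gini index
    1/(2*alpha - 1) for Pareto(alpha, xm) incomes with known scale xm,
    under a Gamma(a0, b0) prior on alpha (a0 = 1 gives an exponential prior)."""
    x = np.asarray(incomes, dtype=float)
    n, t = len(x), np.sum(np.log(x / xm))
    post = stats.gamma(a0 + n, scale=1.0 / (b0 + t))   # conjugate posterior for alpha
    alphas = post.rvs(size=n_draws, random_state=seed)
    # Gini = 1/(2*alpha - 1) when the mean exists (alpha > 1); cap at 1 otherwise
    gini = np.where(alphas > 1.0, 1.0 / (2.0 * alphas - 1.0), 1.0)
    return float(gini.mean())

# Example with simulated incomes (alpha = 2.5, so the true Gini index is 0.25)
sample = stats.pareto.rvs(b=2.5, scale=10_000, size=500, random_state=1)
print(bayes_gini_pareto(sample, xm=10_000))
```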
7.
Doubly periodic non-homogeneous Poisson models for hurricane data
Non-homogeneous Poisson processes with a periodic claim intensity rate have been proposed as claim-count models in risk theory. Here a doubly periodic Poisson model with short- and long-term trends is studied, and beta-type intensity functions are presented as illustrations. The likelihood function and the maximum likelihood estimates of the model parameters are derived. Doubly periodic Poisson models are appropriate when the seasonality does not repeat exactly the same short-term pattern every year but has a peak intensity that varies over a longer period. This reflects periodic environments such as those forming hurricanes in alternating El Niño/La Niña years. An application of the model to the data set of Atlantic hurricanes affecting the United States (1899–2000) is discussed in detail.
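A small sketch of what such an intensity can look like, and of how a non-homogeneous Poisson process with that intensity can be simulated by thinning, is given below; the particular beta-type seasonal shape, the sinusoidal long-term cycle, and all parameter values are illustrative assumptions rather than the model fitted in the paper.

```python
import numpy as np
from scipy import stats

def intensity(t, base=5.0, a=3.0, b=3.0, long_period=4.0, long_amp=0.6):
    """Illustrative doubly periodic intensity (expected events per year):
    a beta-type within-year seasonal shape modulated by a longer cycle."""
    season = stats.beta.pdf(t % 1.0, a, b)                 # short-term (annual) pattern
    long_term = 1.0 + long_amp * np.sin(2 * np.pi * t / long_period)
    return base * season * long_term

def simulate_nhpp(t_max, lam, lam_max, seed=0):
    """Simulate a non-homogeneous Poisson process on [0, t_max] by thinning."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_max:
            return np.array(events)
        if rng.random() < lam(t) / lam_max:
            events.append(t)

# 20 "years" of events; lam_max must bound the intensity over the horizon
times = simulate_nhpp(20.0, intensity, lam_max=5.0 * 1.875 * 1.6)
print(len(times), "simulated events")
```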
8.
The paper analyses the distribution of times from HIV seroconversion to the first AIDS-defining illness for a subcohort of the Western Australian HIV Cohort Study for whom the seroconversion date is known to fall within a calendar-time window. The analysis is based on a generalised gamma model for the incubation times and a piecewise constant distribution for the conditional times of seroconversion given the seroconversion windows. This allows flexible hazard shapes and also allows comparison of the goodness of fit of the gamma and Weibull distributions, which are often used for modelling incubation times. Computational issues are discussed. In these data, neither age at seroconversion, nor calendar time of seroconversion, nor the identification of a seroconversion illness appears to affect the incubation distributions. The Weibull distribution appears to provide a reasonable fit. The distribution of times from seroconversion to an HIV-related death is also briefly considered.
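Because the Weibull is a special case of the generalised gamma, the two fits can be compared directly. The toy comparison below uses simulated, fully observed incubation-like times (the paper's data are window censored, which this sketch does not handle).

```python
import numpy as np
from scipy import stats

# Toy comparison on simulated, fully observed "incubation" times (in years).
rng = np.random.default_rng(1)
t = stats.weibull_min.rvs(c=2.1, scale=9.5, size=200, random_state=rng)

gg_params = stats.gengamma.fit(t, floc=0)       # generalised gamma: (a, c, loc, scale)
wb_params = stats.weibull_min.fit(t, floc=0)    # Weibull is gengamma with a = 1

ll_gg = np.sum(stats.gengamma.logpdf(t, *gg_params))
ll_wb = np.sum(stats.weibull_min.logpdf(t, *wb_params))
aic_gg = 2 * 3 - 2 * ll_gg                      # 3 free parameters (a, c, scale)
aic_wb = 2 * 2 - 2 * ll_wb                      # 2 free parameters (c, scale)
print(f"gen. gamma AIC = {aic_gg:.1f}, Weibull AIC = {aic_wb:.1f}")
```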
9.
This article considers the problem of normal-based two-group classification when the groups are artificially dichotomized by a screening variable. Each group distribution is derived and the best regions for classification are obtained. These derivations yield yet another classification rule. The rule is studied from several aspects, such as its distribution, the optimal error rate, and hypothesis testing, and the article gives the relationships among these aspects along with an investigation of the rule's performance. The classification method and ideas are illustrated in detail with two examples.
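To see why the dichotomized groups are no longer normal (which is what motivates deriving each group distribution afresh), the toy simulation below — our own illustration, not the paper's model — thresholds a screening variable S that is correlated with the feature X; the selected groups become skewed in opposite directions.

```python
import numpy as np
from scipy import stats

# Toy illustration (assumed setup): two groups formed by dichotomizing a screening
# variable S at a cutoff, where (X, S) is bivariate normal with correlation rho.
rng = np.random.default_rng(2)
rho, cutoff, n = 0.7, 0.5, 100_000
x, s = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n).T

group1, group2 = x[s > cutoff], x[s <= cutoff]   # artificial dichotomization
# The selected groups are no longer normal; they are skewed in opposite directions,
# so a rule built from the selected (truncated) distributions is needed.
print("means:   ", group1.mean(), group2.mean())
print("skewness:", stats.skew(group1), stats.skew(group2))
```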
10.
The concept of left censoring has been defined in different ways in statistical applications. Turnbull (1974) defines it in one particular way, whereas in more recent literature, especially in epidemiological studies, it has been defined differently. This difference between the two approaches is the main reason why, despite its simplicity, the Turnbull method is not applicable to all cases of doubly censored data. In this article we present a modified Turnbull method for the analysis of doubly censored data that is consistent with the more recent definition. Comparisons are made with other statistical methods, including an imputation estimator and full-likelihood-based and conditional-likelihood-based approaches, using Iranian HIV data.