Search results: 30 full-text articles in Statistics (published 1998–2021).
1.
Summary. Earthquake intensities are modelled as a function of previous activity whose specific form is based on established empirical laws in seismology, but whose parameter values can vary from place to place. This model is used to characterize regional features of seismic activity in and around Japan, and also to explore regions where the actual seismicity rate systematically deviates from the modelled rate.
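The abstract does not state the model's form, but the "established empirical laws" it alludes to (the Omori–Utsu aftershock decay and the Utsu productivity law) are conventionally combined in the ETAS conditional intensity; as a plausible sketch of the kind of model meant, with location-varying parameters left implicit:

```latex
\lambda(t \mid \mathcal{H}_t)
  = \mu + \sum_{i \,:\, t_i < t} K \, e^{\alpha (M_i - M_0)} \, (t - t_i + c)^{-p}
```

Here $\mu$ is the background seismicity rate, $t_i$ and $M_i$ are past event times and magnitudes, $M_0$ is the magnitude cutoff, and $K$, $\alpha$, $c$, $p$ are the parameters that a regionally varying version would allow to change from place to place.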
2.
This article explores the calculation of tolerance limits for the Poisson regression model based on the profile-likelihood methodology, with small-sample asymptotic corrections to improve coverage-probability performance. The data consist of n counts whose mean, or expected rate, depends on covariates through a log regression function. Upper tolerance limits are evaluated as functions of the covariates and are obtained from upper confidence limits for the mean. Three methodologies are considered for computing the upper confidence limits: likelihood-based asymptotic methods, small-sample asymptotic refinements of the likelihood-based methodology, and the delta method. Two applications are discussed: one concerning defects in semiconductor wafers due to plasma etching, and the other examining the number of surface faults in upper seams of coal mines. All three methodologies are illustrated on both applications.
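The two-step pipeline the abstract describes (an upper confidence limit for the covariate-dependent mean, then an upper tolerance limit derived from it) can be sketched for the delta method. The simulated data, the IRLS fitting loop, and the 95%/95% confidence/content choices below are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from scipy import stats

# --- simulated Poisson regression data (illustrative, not the paper's) ---
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])          # design matrix with intercept
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))        # log link: E[y] = exp(X beta)

# --- fit the Poisson GLM by iteratively reweighted least squares ---
beta = np.zeros(2)
for _ in range(50):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu              # working response
    beta_new = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
    if np.max(np.abs(beta_new - beta)) < 1e-10:
        beta = beta_new
        break
    beta = beta_new

# --- delta-method 95% upper confidence limit for the mean at x0 ---
cov = np.linalg.inv(X.T @ (np.exp(X @ beta)[:, None] * X))  # inverse Fisher info
x0 = np.array([1.0, 1.0])
eta, se = x0 @ beta, np.sqrt(x0 @ cov @ x0)
mu_upper = np.exp(eta + stats.norm.ppf(0.95) * se)

# --- upper tolerance limit: 95% Poisson quantile at the UCL of the mean ---
tol_upper = stats.poisson.ppf(0.95, mu_upper)
```

The small-sample and profile-likelihood methods of the paper replace the Wald-style confidence limit above with sharper ones; the final quantile step is unchanged.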
3.
Optimal designs for estimating the parameters, and also the optimum factor combinations, in multiresponse experiments have been considered by various authors. To date, however, optimum designs in mixture experiments have been studied only in the single-response case. This article investigates optimum designs for estimating the optimum mixing proportions in a multiresponse mixture experiment.
4.
Longitudinal data frequently arise in various fields of the applied sciences, where individuals are measured according to some ordered variable, e.g. time. A common approach to modelling such data is the mixed model for repeated measures, which provides an eminently flexible framework for a wide range of mean and covariance structures. However, such models are forced into a rigidly defined class of mathematical formulas that may not be well supported by the data across the whole sequence of observations. A possible non-parametric alternative is the cubic smoothing spline, which is highly flexible and has useful smoothing properties. Under the normality assumption, the solution of the penalized log-likelihood equation is the cubic smoothing spline, and this solution can further be expressed as the solution of a linear mixed model. It is shown here how cubic smoothing splines can easily be used in the analysis of complete and balanced data, and how the analysis can be greatly simplified by using the unweighted estimator studied in the paper. If the covariance structure of the random errors belongs to a certain class of matrices, the unweighted estimator is the solution to the penalized log-likelihood function; this result is new in the smoothing-spline context and is not confined to growth-curve settings. The connection to mixed models is used to develop a rough test of group profiles. Numerical examples illustrate the proposed techniques.
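As a minimal illustration of the cubic-smoothing-spline alternative (not the paper's unweighted estimator or its mixed-model representation), SciPy's penalized smoothing spline can be fitted directly; the sine test curve, the noise level, and the smoothing target s ≈ nσ² are all assumptions made for the sketch:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# --- noisy observations of a smooth profile (illustrative) ---
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)
sigma = 0.1
y = np.sin(2 * np.pi * t) + rng.normal(0, sigma, t.size)

# Cubic (k=3) smoothing spline; s is the target residual sum of squares,
# set to roughly n * sigma^2 so the fit absorbs the noise, not the signal.
spl = UnivariateSpline(t, y, k=3, s=t.size * sigma**2)
fitted = spl(t)
```

In a balanced longitudinal setting, one such smooth per group (plus the mixed-model representation of the spline) is what enables the rough group-profile comparison the abstract mentions.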
5.
An improved likelihood-based method based on Fraser et al. (1999) is proposed in this paper to test the significance of the second lag of a stationary AR(2) model. Compared with the test proposed by Fan and Yao (2003) and the signed log-likelihood ratio test, the proposed method has remarkable accuracy. Simulation studies illustrate the accuracy of the proposed method, and an application to historical data demonstrates its implementation. Furthermore, the method can be extended to the general AR(p) model.
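The benchmark signed log-likelihood ratio test for the second lag can be sketched under the Gaussian conditional likelihood; the simulated AR(2) series and the OLS-based conditional fit below are illustrative assumptions, and the paper's proposal is a higher-order refinement of this statistic, not shown here:

```python
import numpy as np
from scipy import stats

# --- simulate a stationary AR(2) series (illustrative parameters) ---
rng = np.random.default_rng(2)
n, phi1, phi2 = 300, 0.5, 0.3
y, e = np.zeros(n), rng.normal(0, 1, n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + e[t]

def ols_rss(X, z):
    """OLS fit returning coefficients and residual sum of squares."""
    b, *_ = np.linalg.lstsq(X, z, rcond=None)
    r = z - X @ b
    return b, r @ r

# Conditional (on the first two observations) Gaussian likelihood fits:
z = y[2:]
X2 = np.column_stack([y[1:-1], y[:-2]])   # full AR(2)
X1 = y[1:-1][:, None]                     # restricted AR(1), i.e. phi2 = 0
b2, rss2 = ols_rss(X2, z)
b1, rss1 = ols_rss(X1, z)

# Signed log-likelihood ratio statistic for H0: phi2 = 0
m = z.size
lr = m * np.log(rss1 / rss2)              # 2 * (l_full - l_restricted)
r_stat = np.sign(b2[1]) * np.sqrt(lr)
p_value = 2 * stats.norm.sf(abs(r_stat))  # first-order N(0,1) reference
```

With phi2 = 0.3 and n = 300 the statistic rejects decisively; the accuracy issue the paper addresses arises in much smaller samples, where the N(0, 1) approximation to r degrades.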
6.
The method of likelihood imputation is devised within the framework of latent structure models, where the observation is a statistic of the complete data that can be specified only on a latent basis. The imputed data set is chosen to differ least from the observed one in information content, a concept with general implications for the analysis of incomplete data. In contrast to standard conditional-mean single imputation, our procedure depends on an entire likelihood region rather than any single point in it, and nevertheless yields consistent parameter estimators. We explain its implementation and illustrate it with data from panel surveys and from linear regression with censoring. We also discuss its potential in sensitivity analysis.
7.
Afify et al. [The Weibull Fréchet distribution and its applications, J. Appl. Stat., 43 (2016), pp. 2608–2626] defined and studied a new four-parameter lifetime model called the Weibull Fréchet distribution. They made some mistakes in presenting the log-likelihood function and the components of the score vector. In this note, we correct them.
8.
Selecting an appropriate structure for a linear mixed model is an appealing problem in a number of applications, such as the modelling of longitudinal or clustered data. In this paper, we propose a variable selection procedure that simultaneously selects and estimates the fixed and random effects. More specifically, a profile log-likelihood function, together with an adaptive penalty, is utilized for sparse selection, and Newton–Raphson optimization is used to complete the parameter estimation. By jointly selecting the fixed and random effects, the proposed approach increases selection accuracy compared with two-stage procedures, while the use of the profile log-likelihood improves computational efficiency relative to other one-stage procedures. We prove that the proposed procedure enjoys model selection consistency. A simulation study and a real data application demonstrate the effectiveness of the proposed method.
9.
Local likelihood has been developed mainly from an asymptotic point of view, with little attention to finite-sample issues. The present paper provides simulation evidence of how local likelihood density estimation performs in practice, from two points of view. First, we explore the impact of the normalization step on the final estimate; second, we show the effectiveness of higher-order fits in identifying modes present in the population when only small sample sizes are available. We refer to circular data; nevertheless, our findings straightforwardly extend to the Euclidean setting, where they appear to be somewhat new.
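The degree-zero local likelihood fit on the circle reduces to a von Mises kernel density estimate, which makes a convenient baseline for the higher-order fits the abstract studies. The bimodal test sample and the kernel concentration kappa below are illustrative assumptions:

```python
import numpy as np
from scipy.special import i0
from scipy.stats import vonmises

# --- bimodal circular sample: two von Mises components (illustrative) ---
rng = np.random.default_rng(3)
data = np.concatenate([
    vonmises.rvs(4.0, loc=0.0, size=150, random_state=rng),
    vonmises.rvs(4.0, loc=np.pi, size=150, random_state=rng),
])

def vm_kde(theta, sample, kappa):
    """von Mises kernel density estimate on the circle (degree-0 local fit)."""
    d = theta[:, None] - sample[None, :]
    return np.exp(kappa * np.cos(d)).mean(axis=1) / (2 * np.pi * i0(kappa))

grid = np.linspace(-np.pi, np.pi, 512)
dens = vm_kde(grid, data, kappa=10.0)
```

Each kernel integrates to one over the circle, so the estimate needs no extra normalization; it is the higher-degree local polynomial fits, after exponentiation, that require the normalization step whose impact the paper examines.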
10.
The t distribution has proved to be a useful alternative to the normal distribution, especially when robust estimation is desired. We consider the multivariate nonlinear Student-t regression model and show that the biases of the estimates of the regression coefficients can be computed from an auxiliary generalized linear regression. We give a formula for the biases of the estimates of the parameters in the scale matrix, which can also be computed by means of a generalized linear regression. We briefly discuss some important special cases and present simulation results indicating that our bias-corrected estimates outperform the uncorrected ones in small samples.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号